flaky test: TestSnmpReceiverIntegration #21086

Closed
codeboten opened this issue Apr 20, 2023 · 8 comments

Comments

@codeboten (Contributor)

Component(s)

receiver/snmp

Describe the issue you're reporting

Seen here: https://github.com/open-telemetry/opentelemetry-collector-contrib/actions/runs/4752971094/jobs/8444004878?pr=21085

--- FAIL: TestSnmpReceiverIntegration (39.14s)
    --- FAIL: TestSnmpReceiverIntegration/Integration_test_with_v2c_configuration (16.02s)
        integration_test.go:91: 
            	Error Trace:	/home/runner/work/opentelemetry-collector-contrib/opentelemetry-collector-contrib/receiver/snmpreceiver/integration_test.go:91
            	Error:      	Received unexpected error:
            	            	number of resources doesn't match expected: 4, actual: 0
            	Test:       	TestSnmpReceiverIntegration/Integration_test_with_v2c_configuration
    --- FAIL: TestSnmpReceiverIntegration/Integration_test_with_v3_configuration (11.02s)
        integration_test.go:91: 
            	Error Trace:	/home/runner/work/opentelemetry-collector-contrib/opentelemetry-collector-contrib/receiver/snmpreceiver/integration_test.go:91
            	Error:      	Received unexpected error:
            	            	number of resources doesn't match expected: 4, actual: 0
            	Test:       	TestSnmpReceiverIntegration/Integration_test_with_v3_configuration
FAIL
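
Both failing subtests hit the same assertion at integration_test.go:91: the scraped output contains 0 resources where 4 are expected, which looks like the comparison runs before the SNMP simulator container has answered a scrape. As a minimal sketch (not the actual test code; the helper name, sink type, and timeouts are assumptions), the check could poll the consumer until the expected resource count shows up instead of asserting once:

```go
package snmpreceiver

import (
	"testing"
	"time"

	"github.com/stretchr/testify/require"
	"go.opentelemetry.io/collector/consumer/consumertest"
)

// waitForResources is a hypothetical helper: it polls the metrics sink until
// at least `want` resources have been scraped, failing the test on timeout.
// The 30s/500ms values are placeholder assumptions, not values from the test.
func waitForResources(t *testing.T, sink *consumertest.MetricsSink, want int) {
	require.Eventually(t, func() bool {
		for _, md := range sink.AllMetrics() {
			if md.ResourceMetrics().Len() >= want {
				return true
			}
		}
		return false
	}, 30*time.Second, 500*time.Millisecond,
		"expected %d resources before timeout", want)
}
```

Polling with require.Eventually gives a slow container time to come up while still producing a clear failure message, which is a common way to stabilize container-backed integration tests in Go.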
@codeboten added the "flaky test" label on Apr 20, 2023
@github-actions (bot)

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@TylerHelmuth (Member)

@djaglowski (Member)

@StefanKurek, any idea what could be going wrong here?

@StefanKurek (Contributor)

@djaglowski not quite sure right off the bat. I'm also not sure "flaky" is the accurate word here. It looks like the test started failing at some point in time, and from then on CI failed consistently until this test was turned off.

@StefanKurek (Contributor)

@djaglowski ok, that's what I get for putting something down on paper. Going back far enough, I did see sporadic behavior that would qualify this as "flaky".

@StefanKurek (Contributor)

Alright, I'll be looking at this a little deeper sometime within the next week.

@github-actions (bot)

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions bot added the "Stale" label on Jul 17, 2023
@github-actions (bot)

This issue has been closed as inactive because it has been stale for 120 days with no activity.
