
[apache_spark][nodes] Add Apache Spark package with Nodes data stream #2939

Merged (6 commits) on Apr 4, 2022

Conversation

@yug-rajani (Contributor) commented on Mar 30, 2022

What does this PR do?

  • Generated the skeleton of the Apache Spark integration package (see the layout sketch after this list).
  • Added 1 data stream (Nodes).
  • Added the data collection logic.
  • Added the ingest pipelines.
  • Mapped fields according to the ECS schema and added field metadata in the appropriate YAML files.
  • Added system test cases.
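
For orientation, the following is a rough sketch of where the pieces above typically live, assuming the standard elastic-package package layout; the exact file names are not verified against the final contents of this PR.

# Paths below follow the usual elastic-package conventions (assumed, not confirmed by this PR's diff).
ls packages/apache_spark/manifest.yml                                                  # package manifest (Kibana version constraint)
ls packages/apache_spark/changelog.yml                                                 # changelog entry
ls packages/apache_spark/data_stream/nodes/manifest.yml                                # Nodes data stream definition
ls packages/apache_spark/data_stream/nodes/fields/                                     # ECS and custom field metadata (*.yml)
ls packages/apache_spark/data_stream/nodes/elasticsearch/ingest_pipeline/default.yml   # ingest pipeline
ls packages/apache_spark/data_stream/nodes/_dev/test/system/                           # system test configuration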

Checklist

  • I have reviewed tips for building integrations and this pull request is aligned with them.
  • I have verified that all data streams collect metrics or logs.
  • I have added an entry to my package's changelog.yml file.
  • If I'm introducing a new feature, I have modified the Kibana version constraint in my package's manifest.yml file to point to the latest Elastic stack release (e.g. ^7.13.0).

How to test this PR locally

  • Clone the integrations repo.
  • Install elastic-package locally.
  • Start the Elastic Stack using elastic-package.
  • Move to the integrations/packages/apache_spark directory.
  • Run the following command to run the tests (a consolidated command sketch follows):

elastic-package test
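
The steps above, consolidated into one sequence; the elastic-package install command and the stack flag are assumptions based on the elastic-package README, so adjust them to your setup.

git clone https://github.com/elastic/integrations.git
go install github.com/elastic/elastic-package@latest   # one common way to install elastic-package (assumed)
elastic-package stack up -d                            # start the Elastic Stack locally in the background
cd integrations/packages/apache_spark
elastic-package test                                   # run all test types defined for the package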

yug-rajani requested a review from a team as a code owner on March 30, 2022 at 15:22
@elasticmachine commented on Mar 30, 2022

💚 Build Succeeded


Build stats

  • Start Time: 2022-04-04T12:58:34.198+0000

  • Duration: 13 min 13 sec

Test stats 🧪

Test Results:
  • Failed: 0
  • Passed: 3
  • Skipped: 0
  • Total: 3

🤖 GitHub comments

To re-run your PR in the CI, just comment with:

  • /test: Re-trigger the build.

@yug-rajani (Contributor, Author) commented:

This PR is a split of #2811, as discussed in #2811 (comment).

yug-rajani self-assigned this on Mar 30, 2022
yug-rajani requested a review from mtojek on March 30, 2022 at 19:19
yug-rajani added the enhancement, Team:Integrations, and New Integration labels on Mar 30, 2022
@elasticmachine commented:

Pinging @elastic/integrations (Team:Integrations)

@mtojek (Contributor) left a comment:

I'm leaving Kibana resources for later.

@cla-checker-service (bot) commented on Apr 4, 2022:

💚 CLA has been signed

yug-rajani force-pushed the package_apache_spark_nodes branch from 69b6944 to 727dcc9 on April 4, 2022 at 10:56
yug-rajani force-pushed the package_apache_spark_nodes branch from 727dcc9 to 860aeb3 on April 4, 2022 at 12:13
yug-rajani requested review from mtojek and ruflin on April 4, 2022 at 13:14
@mtojek (Contributor) left a comment:

Ship it :)

yug-rajani linked an issue on Apr 26, 2022 that may be closed by this pull request (16 tasks)
Labels
  • enhancement (New feature or request)
  • New Integration (Issue or pull request for creating a new integration package)
  • Team:Integrations (Label for the Integrations team)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Create Apache Spark integration
5 participants