spark-streaming
Here are 625 public repositories matching this topic...
Is your feature request related to a problem? Please describe.
Some areas of the web portal have issues with screen readers. Here are a few examples.
Describe the solution you'd like
Improve readability for screen readers across the web portal.
Describe the bug
The Gimel logo does not appear on readthedocs.
http://gimel.readthedocs.io/en/latest/getting-started/learn-data-API-usage/
To Reproduce
Open the link http://gimel.readthedocs.io/en/latest/getting-started/learn-data-API-usage/
I am able to consume the Kinesis stream using this jar as a normal consumer. When I updated the user account to an enhanced fan-out consumer, I was unable to access the stream.
Is there any way to access the stream as an enhanced fan-out consumer?
The current azure-pipelines.yaml is highly duplicated, especially across the test stages (E2E Tests, E2E Backward Compatibility Tests, and E2E Forward Compatibility Tests).
It should be refactored to remove this duplication and make the pipeline easier to maintain (e.g., when adding a new Spark version to test against).
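One standard way to remove this kind of duplication in Azure Pipelines is a parameterized stage template: the shared test steps live in one file, and each E2E variant becomes a short template reference. The sketch below is illustrative only; the file name `e2e-test-template.yml`, the stage names, the `run-e2e-tests.sh` script, and the Spark versions are all hypothetical stand-ins for whatever the real pipeline uses.

```yaml
# e2e-test-template.yml (hypothetical file name)
# Shared definition of one E2E test stage, parameterized by Spark version.
parameters:
  - name: stageName
    type: string
  - name: sparkVersion
    type: string

stages:
  - stage: ${{ parameters.stageName }}
    jobs:
      - job: RunE2ETests
        steps:
          # Hypothetical test entry point; the real pipeline's steps go here once.
          - script: ./run-e2e-tests.sh --spark-version ${{ parameters.sparkVersion }}
            displayName: Run E2E tests against Spark ${{ parameters.sparkVersion }}
```

With the template in place, each duplicated stage in azure-pipelines.yaml collapses to a few lines, and adding a new Spark version to test against is a two-line change:

```yaml
# azure-pipelines.yaml (sketch): each test variant references the same template.
stages:
  - template: e2e-test-template.yml
    parameters:
      stageName: E2E_Tests
      sparkVersion: '3.0.1'   # hypothetical version
  - template: e2e-test-template.yml
    parameters:
      stageName: E2E_Backward_Compatibility_Tests
      sparkVersion: '2.4.5'   # hypothetical version
```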