Connecting to SAP ECC
I'm looking to meet with others who have successfully connected Databricks to SAP ECC.
- 435 Views
- 0 replies
- 0 kudos
Hello, I'm experiencing difficulty logging into the Databricks Community despite using the correct username and password. Additionally, when attempting to reset my password, I haven't received any email notifications.
Add Vancouver group here
Enjoying the Databricks Summit
Would be great to hear about any challenges, solutions, or tips. Thank you!
Supposedly there are four major types of clusters in Databricks: General Purpose, Storage Optimized, Memory Optimized, and Compute Optimized. But I'm not able to find detailed information on which cluster to choose specifically in which...
What is the best method to expose Azure Databricks metrics to Prometheus specifically? And is it possible to get the underlying Spark metrics as well? All I can see clearly defined in the documentation is the serving endpoint metrics: https://learn.micro...
Hello, I don't have Databricks running as a pod in an AKS cluster; it's running on Azure as SaaS. What should I do to export the metrics to Prometheus?
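Since you can't scrape pods in a managed (SaaS) deployment, one common approach is to have Spark itself expose metrics in Prometheus format. This is a sketch, assuming a Spark 3.x Databricks runtime; the settings below go into the cluster's Spark config (Advanced options) and use Spark's built-in `PrometheusServlet` sink:

```
# Expose executor metrics at <driver-ui>/metrics/executors/prometheus (Spark 3.0+)
spark.ui.prometheus.enabled true
# Route the driver's metrics registry through the Prometheus servlet sink,
# served at <driver-ui>/metrics/prometheus
spark.metrics.conf.*.sink.prometheusServlet.class org.apache.spark.metrics.sink.PrometheusServlet
spark.metrics.conf.*.sink.prometheusServlet.path /metrics/prometheus
```

Your Prometheus server would then scrape those endpoints through whatever network path reaches the driver UI (e.g. the workspace's driver proxy URL); how you authenticate that scrape depends on your setup and is not covered here.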
What is the best way to implement a streaming data flow from Kafka to Databricks (Delta tables)?
Structured Streaming: https://spark.apache.org/docs/latest/structured-streaming-programming-guide.html
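To make that answer concrete, here is a minimal sketch of a Kafka-to-Delta Structured Streaming job. It assumes a PySpark/Databricks environment; the broker address, topic name, and paths are placeholders, not values from the thread:

```python
def kafka_source_options(bootstrap_servers, topic, starting_offsets="latest"):
    """Build the option map for spark.readStream.format('kafka')."""
    return {
        "kafka.bootstrap.servers": bootstrap_servers,
        "subscribe": topic,
        "startingOffsets": starting_offsets,
    }


def start_kafka_to_delta(spark, bootstrap_servers, topic, table_path, checkpoint_path):
    """Read a Kafka topic and continuously append it to a Delta table."""
    stream = (
        spark.readStream.format("kafka")
        .options(**kafka_source_options(bootstrap_servers, topic))
        .load()
        # Kafka delivers key/value as binary; cast before storing.
        .selectExpr(
            "CAST(key AS STRING) AS key",
            "CAST(value AS STRING) AS value",
            "topic", "partition", "offset", "timestamp",
        )
    )
    return (
        stream.writeStream.format("delta")
        .option("checkpointLocation", checkpoint_path)  # required for exactly-once recovery
        .outputMode("append")
        .start(table_path)
    )
```

The checkpoint location is what lets the stream resume from the right Kafka offsets after a restart, so it should live on durable storage, not ephemeral disk.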
Hi there! I am trying to write batch data to a Kafka topic with Schema Registry in Databricks using PySpark. I serialize the data with PySpark's to_avro function and write it to the topic, but the consumers can't read the schema ID. If they do not separ...
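A likely cause (stated here as an assumption about this setup): Spark's `to_avro` emits raw Avro bytes, while Confluent-compatible consumers expect each message to start with a 5-byte wire-format header, a magic byte `0` followed by the 4-byte big-endian schema ID. A small sketch of framing and unframing that header:

```python
import struct

MAGIC_BYTE = 0  # Confluent wire-format magic byte


def to_confluent(avro_payload: bytes, schema_id: int) -> bytes:
    """Prepend the Confluent wire-format header (magic byte + 4-byte
    big-endian schema id) to a raw Avro payload, e.g. one produced by
    pyspark.sql.avro.functions.to_avro."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + avro_payload


def from_confluent(message: bytes):
    """Split a Confluent-framed message back into (schema_id, payload)."""
    magic, schema_id = struct.unpack(">bI", message[:5])
    if magic != MAGIC_BYTE:
        raise ValueError("not a Confluent-framed message")
    return schema_id, message[5:]
```

In a Spark job you would apply something like `to_confluent` (e.g. via a UDF, with the schema ID looked up from your registry) to the `to_avro` output column before writing to Kafka, so consumers using Confluent deserializers can resolve the schema.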
To identify the reasons for a data process's poor performance, we need to navigate and analyze the metrics in the Spark UI manually... However, replicating those steps for a large group of Spark applications would be very expensive time-wise... Given thi...
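One way to automate that triage is Spark's monitoring REST API, which serves the same data the Spark UI renders as JSON. A hedged sketch, assuming network access to each application's driver UI (the base URL and ranking heuristic here are illustrative, not from the thread):

```python
import json
from urllib.request import urlopen


def fetch_stages(ui_base_url, app_id):
    """Pull stage metrics from the Spark monitoring REST API,
    e.g. ui_base_url='http://driver-host:4040'."""
    url = f"{ui_base_url}/api/v1/applications/{app_id}/stages"
    with urlopen(url) as resp:
        return json.load(resp)


def slowest_stages(stages, top_n=3):
    """Rank completed stages by executor run time: a simple first pass
    when triaging many applications without opening each Spark UI."""
    completed = [s for s in stages if s.get("status") == "COMPLETE"]
    ranked = sorted(completed, key=lambda s: s.get("executorRunTime", 0), reverse=True)
    return [(s["stageId"], s.get("executorRunTime", 0)) for s in ranked[:top_n]]
```

Looping `fetch_stages` over an application list and feeding the results to `slowest_stages` (or similar checks for shuffle/spill fields in the same payload) turns the manual Spark UI walk into a batch report.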
Check out our platform architecture: italgas-from-gas-pipelines-to-data-pipelines-fueling-our-reporting-with-the-latest-innovations-7f00e20ba115
Hi all, I am calling the get job run list API to get all task IDs and reference them in the dbt-artifacts view created by a dbt job run. The question is: I can see 'task run id' on screen, but it doesn't come back in the API response. Is there a way to get it? I checked ...
Never mind, I found task_run_id in the getRun API: https://docs.databricks.com/api/azure/workspace/jobs/getrun. I overlooked it at first since it is buried under a nested JSON structure: tasks[] > run_id. This clarifies and solves my problem!
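For anyone hitting the same thing, a small sketch of digging those per-task run IDs out of a getRun response (the field names `tasks`, `task_key`, and `run_id` match the nesting described above; error handling is omitted):

```python
def task_run_ids(get_run_response: dict) -> dict:
    """Map task_key -> run_id from a Jobs getRun API response.

    The per-task run ids are nested under the top-level tasks[] array,
    not at the top level of the response.
    """
    return {t["task_key"]: t["run_id"] for t in get_run_response.get("tasks", [])}
```

Example: `task_run_ids({"run_id": 1, "tasks": [{"task_key": "dbt", "run_id": 99}]})` returns `{"dbt": 99}`, which you can then join against the dbt-artifacts view.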
Which is better for metadata handling?
Hi, I still have some questions. I have Databricks on AWS and I need to mount S3 buckets. According to the documentation, it is recommended to do it through Unity Catalog, but how would I go about reading data from a notebook that would be mount...
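With Unity Catalog, the usual pattern is not a mount at all: S3 access is granted through an external location, and notebooks read either a registered table via the three-level namespace or the S3 path directly. A sketch, where the catalog, schema, table, and bucket names are all hypothetical placeholders:

```python
def uc_table_name(catalog: str, schema: str, table: str) -> str:
    """Build the three-level Unity Catalog name used instead of mount paths."""
    return f"{catalog}.{schema}.{table}"


def read_examples(spark):
    # Preferred: read a table registered on top of the S3 external location.
    df = spark.read.table(uc_table_name("main", "analytics", "sales"))
    # Also possible: read the S3 path directly, provided an external
    # location (and your grants) cover that prefix.
    raw = spark.read.format("parquet").load("s3://my-bucket/raw/sales/")
    return df, raw
```

Either way the notebook never touches `dbutils.fs.mount`; access control lives in the catalog grants rather than in instance profiles attached to mounts.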
Coming back to this: I've already figured it out, so I'm marking it as resolved.
Passionate about hosting events and connecting people? Help us grow a vibrant local community—sign up today to get started!
Sign Up Now