Hi everyone, I was wondering if anyone here has any experience or tips for reading data from AWS DocumentDB. I am working on this using the MongoDB connector. For DocumentDB we also need to work with the required credentials issued as a .pem file by AWS. Th...
Hi @Kaniz Fatma, thank you so much for your response. Your suggestions were helpful. As per the AWS documentation, DocumentDB is MongoDB compatible: "With Amazon DocumentDB, you can run the same application code and use the same drivers and tools th...
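Since DocumentDB speaks the MongoDB wire protocol, a read through the Spark MongoDB connector can be sketched as below. The hostname, credentials, and database/collection names are placeholders, and the sketch assumes the AWS CA bundle from the .pem file has already been made trusted on the cluster (for example, imported into the JVM truststore by an init script):

```python
# Sketch: reading from Amazon DocumentDB via the Spark MongoDB connector.
# Host, credentials, database, and collection below are hypothetical values.

def build_docdb_uri(user, password, host, port=27017):
    """Build a DocumentDB connection URI with TLS enabled.

    DocumentDB requires TLS and does not support retryable writes,
    hence ssl=true and retryWrites=false in the query string.
    """
    return (
        f"mongodb://{user}:{password}@{host}:{port}/"
        "?ssl=true&replicaSet=rs0&readPreference=secondaryPreferred"
        "&retryWrites=false"
    )

def read_collection(spark, uri, database, collection):
    # Requires the MongoDB Spark connector library on the cluster.
    return (
        spark.read.format("mongodb")
        .option("connection.uri", uri)
        .option("database", database)
        .option("collection", collection)
        .load()
    )
```

The URI builder is ordinary string work and can be sanity-checked outside a cluster; the actual `read_collection` call of course needs a Databricks cluster with the connector installed.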
Currently our service provides an API to serve purchase records. The purchase records are stored in a SQL database. To simplify: when users want to get their recent purchase records, they make an API call, and the API call runs a SQL query on the D...
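The "recent purchases" lookup that the API performs can be sketched with an in-memory SQLite table. The schema, table name, and sample rows below are made up for illustration; the real service's database will differ:

```python
import sqlite3

# Hypothetical schema standing in for the service's real purchase table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE purchases (user_id TEXT, item TEXT, purchased_at TEXT)"
)
conn.executemany(
    "INSERT INTO purchases VALUES (?, ?, ?)",
    [
        ("u1", "book", "2023-01-03"),
        ("u1", "pen", "2023-01-01"),
        ("u2", "lamp", "2023-01-02"),
    ],
)

def recent_purchases(conn, user_id, limit=10):
    """Return a user's most recent purchases, newest first."""
    rows = conn.execute(
        "SELECT item, purchased_at FROM purchases "
        "WHERE user_id = ? ORDER BY purchased_at DESC LIMIT ?",
        (user_id, limit),
    )
    return rows.fetchall()
```

Each API call then maps to one parameterized query like this, which is the access pattern being discussed.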
Hi @Stanley Tang, we haven’t heard from you since the last response from @Debayan Mukherjee, and I was checking back to see if his suggestions helped you. Otherwise, if you have found a solution, please share it with the community, as it can be helpful...
I have two very similarly configured workspaces, one in us-west-2 and one in us-east-2. Both were configured by default with a "Starter Warehouse". The one in us-west-2 I can reach over the internet using the Python databricks-sql-connector, but the one in us-...
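For comparing the two regions, the connection attempt can be sketched with databricks-sql-connector. The hostname, HTTP path, and token are placeholders; whether the call succeeds depends on per-workspace network settings (IP access lists, private link, and similar), which is one plausible reason one region answers and the other does not:

```python
# Sketch of querying a SQL warehouse with databricks-sql-connector
# (pip install databricks-sql-connector). All connection values are
# placeholders taken from the warehouse's "Connection details" tab.

def query_warehouse(server_hostname, http_path, access_token, query):
    from databricks import sql

    with sql.connect(
        server_hostname=server_hostname,
        http_path=http_path,
        access_token=access_token,
    ) as connection:
        with connection.cursor() as cursor:
            cursor.execute(query)
            return cursor.fetchall()

# Example (placeholder values):
# rows = query_warehouse(
#     "dbc-xxxxxxxx-xxxx.cloud.databricks.com",
#     "/sql/1.0/warehouses/abc123",
#     "dapi...",
#     "SELECT 1",
# )
```

Running the same snippet against both workspaces, with only the hostname and HTTP path swapped, isolates whether the difference is network-side rather than client-side.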
Hi @Marcus Simonsen, hope all is well! Just wanted to check in to see if you were able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Th...
Consider that User A has deployed the job to prod, User B has scheduled the job through an external orchestration tool, and User C has been granted owner privileges by User A. Whose email ID would be displayed when the Databricks job runs?
Hi, could anyone please help with the steps for connecting to Cassandra from Azure Databricks? I have followed the steps in https://learn.microsoft.com/en-us/azure/databricks/_static/notebooks/azure/cassandra-azure.html but I am getting the error below....
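For reference, the Spark settings involved can be sketched as below. The host, port, and credentials are placeholders, and the sketch assumes the spark-cassandra-connector library is installed on the cluster:

```python
# Sketch: Spark configuration for the Cassandra connector on Databricks.
# Host, port, and credentials are hypothetical placeholder values.

cassandra_conf = {
    "spark.cassandra.connection.host": "mycassandra.example.com",
    "spark.cassandra.connection.port": "9042",
    "spark.cassandra.connection.ssl.enabled": "true",
    "spark.cassandra.auth.username": "cassandra_user",
    "spark.cassandra.auth.password": "cassandra_password",
}

def read_cassandra_table(spark, keyspace, table, conf=cassandra_conf):
    """Apply connector settings, then read one keyspace.table as a DataFrame."""
    for key, value in conf.items():
        spark.conf.set(key, value)
    return (
        spark.read.format("org.apache.spark.sql.cassandra")
        .options(keyspace=keyspace, table=table)
        .load()
    )
```

Most connector errors trace back to one of these settings (wrong port, SSL mismatch, or missing auth), so checking them against the target cluster is a reasonable first step.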
In Databricks I am trying to read data from a protected Kafka topic using PySpark. I am getting the error "unable to find LoginModule class: org.apache.kafka.common.security.plain.PlainLoginModule".
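On Databricks the Kafka client classes are shaded, so a JAAS config that names the plain `org.apache.kafka...PlainLoginModule` class typically fails with exactly this error; the `kafkashaded.`-prefixed class name is what the runtime can find. A sketch of the reader options, with placeholder broker, topic, and credentials:

```python
# Sketch: SASL/PLAIN Kafka options for Databricks. Note the kafkashaded.
# prefix on the login module; broker, topic, and credentials are placeholders.

KAFKA_OPTIONS = {
    "kafka.bootstrap.servers": "broker1.example.com:9093",
    "subscribe": "protected-topic",
    "kafka.security.protocol": "SASL_SSL",
    "kafka.sasl.mechanism": "PLAIN",
    "kafka.sasl.jaas.config": (
        "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule "
        'required username="my-user" password="my-secret";'
    ),
}

def read_kafka_stream(spark, options=KAFKA_OPTIONS):
    """Open a streaming read on the protected topic with the options above."""
    return spark.readStream.format("kafka").options(**options).load()
```

In practice the credentials would come from a secret scope rather than literals, but the shaded class name is the part that addresses this particular error.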
What is the best practice for logging in Databricks notebooks? I have a bunch of notebooks that run in parallel through a workflow. I would like to keep track of everything that happens, such as errors coming from a stream, and I would like these logs to ...
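One common approach is the standard `logging` module with a named logger per notebook and a file handler pointed at persistent storage. The sketch below is one way to set that up; the DBFS path in the docstring is an assumption, not a convention:

```python
import logging

def get_notebook_logger(name, log_path):
    """Create a per-notebook logger writing to a shared log file.

    Pointing log_path at the DBFS FUSE mount (e.g. '/dbfs/logs/jobs.log',
    a hypothetical path) keeps the logs after the cluster terminates.
    Using a distinct `name` per notebook makes parallel runs easy to
    tell apart in one file.
    """
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)
    if not logger.handlers:  # avoid duplicate handlers on notebook re-runs
        handler = logging.FileHandler(log_path)
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
        )
        logger.addHandler(handler)
    return logger

# Usage in a notebook:
# log = get_notebook_logger("ingest_notebook", "/dbfs/logs/jobs.log")
# log.error("stream failed: %s", exc)
```

The `if not logger.handlers` guard matters in notebooks specifically, because re-running a cell would otherwise attach a new handler each time and duplicate every line.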
Hi team, I started getting this message lately when trying to add some new config or change my workspace with Terraform: Error: cannot create global init script: authentication is not configured for provider. Please check https://registry.terraform.io/p...
Hi @Avi Edri, it looks like you are using a provider that is authenticated to the accounts console (https://accounts.cloud.databricks.com) to create a global init script within the workspace. Can you try authenticating with the workspace host and a PAT token? Follow th...
I need to move a group of files (Python or Scala files) or a folder from a DBFS location to the user workspace directory in Azure Databricks so I can test the files. It is very difficult to upload each file one by one into the user workspace directory, so is it...
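One way to avoid one-by-one uploads is a recursive copy on the driver, since DBFS is mounted under `/dbfs` there. The sketch below is generic; the `/dbfs/mnt/code` source path mentioned in the docstring is a hypothetical example, and whether the destination is writable as a plain directory depends on the workspace setup:

```python
import shutil
from pathlib import Path

def copy_tree(src_dir, dst_dir):
    """Recursively copy a folder of files instead of uploading one by one.

    On a Databricks driver, src_dir could be a DBFS FUSE path such as
    '/dbfs/mnt/code' (a made-up mount for illustration). Returns the
    relative paths of the files that were copied, sorted.
    """
    src, dst = Path(src_dir), Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    for item in src.rglob("*"):
        target = dst / item.relative_to(src)
        if item.is_dir():
            target.mkdir(parents=True, exist_ok=True)
        else:
            shutil.copy2(item, target)
    return sorted(p.relative_to(dst) for p in dst.rglob("*") if p.is_file())
```

The Databricks CLI's `workspace import_dir` command is the other common route for bulk-importing source files into a workspace folder.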
Guys, I am using Databricks Community Edition to study. I put some files in a Blob and granted all access, but I have no idea why Databricks is not reading them. Please see the code below, and thanks for helping!
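For comparison, a minimal account-key read from Azure Blob Storage can be sketched like this. The account, container, path, and key are placeholders, and note that Community Edition restricts some storage features relative to a full workspace, so the same code may behave differently there:

```python
# Sketch: reading a CSV from Azure Blob Storage with an account key.
# Account, container, path, and key are hypothetical placeholder values.

def read_blob_csv(spark, account, container, path, account_key):
    spark.conf.set(
        f"fs.azure.account.key.{account}.blob.core.windows.net", account_key
    )
    url = f"wasbs://{container}@{account}.blob.core.windows.net/{path}"
    return spark.read.format("csv").option("header", "true").load(url)

# Usage (placeholders):
# df = read_blob_csv(spark, "mystorageacct", "data", "sales.csv", "base64key==")
```

If a read like this still fails, the error message usually distinguishes an auth problem (wrong key or conf name) from a path problem (wrong container or file name).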
Hi @Fernando Rezende, thank you for sharing the solution with us. It would mean a lot if you could select the "Best Answer" to help others find the correct answer faster. This makes that answer appear right after the question, so it's easier to find w...
I tried to use Spark as much as possible but am experiencing some regression. Hopefully I can get some direction on how to use it correctly. I've created a Databricks table using spark.sql:

```python
spark.sql('select * from example_view ') \
    .write \
    .mode('overwr...
```
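If the intent of the snippet above is to materialize a view into a table, one minimal sketch (assuming overwrite semantics and made-up view/table names) is:

```python
def materialize_view(spark, view_name, table_name):
    """Write the contents of a view out as a managed table, replacing it."""
    (
        spark.sql(f"SELECT * FROM {view_name}")
        .write.mode("overwrite")
        .saveAsTable(table_name)
    )

# Usage (hypothetical names):
# materialize_view(spark, "example_view", "example_table")
```

Note this recomputes the view's query on every run; if the regression is performance-related, the view definition itself is usually the first place to look.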
I have upgraded my expired Student subscription to 'Azure subscription 1' in the Azure portal today. I want to use Databricks for personal projects as pay-as-you-go. When I go to my Databricks workspace and my notebook and try to create a cluster, Comp...