by kashy • New Contributor III
- 1540 Views
- 6 replies
- 1 kudos
Hello, how can I configure a foreign catalog connection to use SSH tunneling? I want to be able to use Unity Catalog
Latest Reply
Hi, in addition to our previous message, you can try https://docs.databricks.com/en/query-federation/foreign-catalogs.html and https://grant-6562.medium.com/connecting-to-sql-server-through-an-ssh-tunnel-with-python-17de859caca5.
Also please tag @Deb...
5 More Replies
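Lakehouse Federation has no documented SSH-tunnel option of its own, so the linked article's approach is to open the tunnel yourself and point a JDBC read at the local forwarded port. A minimal sketch, assuming the `sshtunnel` package is available; the host, user, key path, and database names are placeholders, and the tunnel must run somewhere the cluster driver can reach:

```python
def tunnel_jdbc_url(local_port: int, database: str) -> str:
    """Build a SQL Server JDBC URL pointing at the local end of a tunnel."""
    return f"jdbc:sqlserver://localhost:{local_port};databaseName={database}"

def open_tunnel(ssh_host: str, ssh_user: str, ssh_key_path: str,
                remote_host: str, remote_port: int = 1433):
    # Deferred import so tunnel_jdbc_url stays usable without sshtunnel installed.
    from sshtunnel import SSHTunnelForwarder
    tunnel = SSHTunnelForwarder(
        (ssh_host, 22),
        ssh_username=ssh_user,
        ssh_pkey=ssh_key_path,
        remote_bind_address=(remote_host, remote_port),
    )
    tunnel.start()
    return tunnel  # tunnel.local_bind_port is the forwarded local port
```

After `open_tunnel(...)` returns, `tunnel_jdbc_url(tunnel.local_bind_port, "mydb")` could be passed to `spark.read.jdbc()`; this bypasses the foreign-catalog feature rather than configuring it.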
- 1074 Views
- 2 replies
- 1 kudos
Hi there, I am having an issue with writing a df to a table or displaying it. I have three dataframes that I have unioned, and after I have done the union I cannot display the dataframe. df_table1 = spark.sql(f'SELECT * FROM {sql_full_name}') df_table2 = ...df...
Latest Reply
Hi @Kristjan , Based on the given information and sources, the issue with writing the dataframe to a table or displaying it seems to be related to the configuration of the storage account.
The error message "Invalid configuration value detected for ...
1 More Replies
- 1667 Views
- 2 replies
- 0 kudos
Hello, I successfully installed the extension and connected it to my Databricks account. But when I try to select the repo (which already exists under Repos in my Databricks account) for syncing, I don't see it. I use Azure DevOps (Git repo) as s...
Latest Reply
Hi @Mihai_Cog ,
Based on the given information, it seems that the user has successfully installed the Databricks extension and connected it to their Databricks account.
However, when they try to select a repo for syncing, they are unable to see i...
1 More Replies
- 513 Views
- 1 reply
- 1 kudos
I am planning to add team names in custom tags and was hoping I could do it with allowList and then have the user choose from the list. I am trying to avoid having multiple policy files per team. Has anybody found a good way to do this? Maybe using globa...
Latest Reply
Hi @AnirbanDas , To add team names as custom tags in Delta Live Tables pipelines, you can follow these steps:
1. Click on the "Workflows" option in the left sidebar menu.
2. Select "Delta Live Tables" from the options.
3. Click on "Create Pipeline" to ...
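On the original question about a single policy with an allowList: a cluster policy can constrain one tag key to a fixed set of values, so users pick their team at cluster creation instead of each team needing its own policy file. A minimal sketch of the policy definition; the tag key `team` and the team names are placeholders:

```json
{
  "custom_tags.team": {
    "type": "allowlist",
    "values": ["team-a", "team-b", "team-c"]
  }
}
```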
- 17911 Views
- 1 replies
- 0 kudos
To import an Excel file into Databricks, you can follow these general steps:
1. **Upload the Excel File**:
- Go to the Databricks workspace or cluster where you want to work.
- Navigate to the location where you want to upload the Excel file.
- Click on ...
Latest Reply
Hi @OnerFusion-AI ,
Your detailed guide on importing Excel files into Databricks is incredibly informative!
The step-by-step instructions and Python code example using Spark showcase a robust approach for handling Excel data within the Databricks e...
- 642 Views
- 1 reply
- 0 kudos
Hello, is it possible to utilize S3 tags when writing a DataFrame with PySpark? Or is the only option to write the dataframe and then use boto3 to tag all the files? More information about S3 object tagging is here: Amazon S3 Object Tagging. Thank you.
Latest Reply
Hi @andresalvati ,
The typical approach is to write the DataFrame and then use the AWS SDK, such as boto3 for Python, to set the S3 object tags on the files individually after they have been written.
Here's a general outline of how you co...
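A hedged sketch of that two-step approach: write the DataFrame, then tag every object under the output prefix. The bucket, prefix, and tag names are placeholders, and it assumes boto3 is available with `s3:PutObjectTagging` permission:

```python
def tag_set(tags: dict) -> dict:
    """Convert a plain dict into the TagSet shape the S3 tagging API expects."""
    return {"TagSet": [{"Key": k, "Value": v} for k, v in tags.items()]}

def tag_s3_prefix(bucket: str, prefix: str, tags: dict) -> None:
    """Apply the same tags to every object under bucket/prefix."""
    # Deferred import so tag_set is importable without boto3 installed.
    import boto3
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            s3.put_object_tagging(
                Bucket=bucket, Key=obj["Key"], Tagging=tag_set(tags)
            )
```

For example, after `df.write.parquet("s3://my-bucket/out/")` one would call `tag_s3_prefix("my-bucket", "out/", {"team": "data-eng"})`. Note tagging happens per object, so each Parquet part file gets its own call.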
- 428 Views
- 1 reply
- 1 kudos
Do I need to save the data locally and run the plotting locally as well, or does anyone have a smart solution to this?
Latest Reply
Hi @jamescw ,
- Databricks provides various ways to save and display charts generated with Plotly.
- One way is to save the chart as a JPG or PNG file on the driver node and display it in a notebook using the displayHTML() method.
- To do this, the Pl...
- 596 Views
- 2 replies
- 0 kudos
Hi, Currently, we have two different AWS accounts: dev and prod. We also have two different workspaces: one for dev and another for prod. The strange thing is that prod costs are being added to the dev account on AWS Cost Explorer ("Databricks Lakeho...
Latest Reply
Databricks uses tags and AWS CloudTrail logs to connect and report costs to AWS. Tags can be used to monitor costs and attribute Databricks usage to different business units and teams. AWS CloudTrail logs can be used to calculate the exact cost of API ...
1 More Replies
by Skv • New Contributor II
- 623 Views
- 2 replies
- 0 kudos
I am trying to filter data based on a date format on a date column. Below is the query formed in Databricks, which is giving empty results. SELECT * FROM Test.TestSchema.Address where TO_DATE(TO_VARCHAR(MODIFIEDDATE,'YYYY-MM-DD')) = '2023-09...
Latest Reply
Hi @Skv, The issue you're experiencing could be due to using the uppercase pattern 'YYYY-MM-DD' in your date format.
In Spark 3.0 and above, which includes current versions of Databricks Runtime, the DateTimeFormatter doesn't recognize 'YYYY-MM-DD' as a valid date pat...
1 More Replies
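A sketch of the corrected filter, keeping the table name from the question. Lowercase 'yyyy' and 'dd' are year and day-of-month in Spark 3+ pattern letters; week-based patterns like uppercase 'Y' are unsupported since Spark 3.0 and raise an error unless the legacy time parser policy is enabled:

```python
def modified_on(table: str, day: str) -> str:
    """Build a calendar-date filter using Spark 3+ pattern letters.

    A plain to_date(MODIFIEDDATE) = DATE'...' would also work; the
    date_format round-trip mirrors the original query's structure.
    """
    return (
        f"SELECT * FROM {table} "
        f"WHERE to_date(date_format(MODIFIEDDATE, 'yyyy-MM-dd')) = DATE'{day}'"
    )
```

The resulting string would be passed to `spark.sql(...)`; note `TO_VARCHAR` from the original query is not a Spark SQL function, so `date_format` replaces it here.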
- 359 Views
- 1 reply
- 0 kudos
How do you set permissions so a user can execute queries but cannot modify settings on the SQL warehouse?
Latest Reply
Hi, Have you tried this: https://docs.databricks.com/en/security/auth-authz/access-control/sql-endpoint-acl.html
Please let us know if this helps.
Thanks!
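Beyond the UI flow in that doc, the CAN USE permission level is the one that allows running queries without editing the warehouse. A sketch of granting it programmatically, assuming the Permissions REST API's `sql/warehouses` object type; the workspace URL, token, warehouse ID, and user are placeholders:

```python
import json
import urllib.request

def can_use_acl(user: str) -> dict:
    """CAN_USE lets a user run queries on a warehouse but not change its settings."""
    return {"access_control_list": [
        {"user_name": user, "permission_level": "CAN_USE"}
    ]}

def grant_can_use(host: str, token: str, warehouse_id: str, user: str) -> None:
    """PATCH merges this grant with the warehouse's existing permissions."""
    req = urllib.request.Request(
        f"{host}/api/2.0/permissions/sql/warehouses/{warehouse_id}",
        data=json.dumps(can_use_acl(user)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="PATCH",
    )
    urllib.request.urlopen(req)
```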
by botn • New Contributor
- 511 Views
- 1 reply
- 0 kudos
Using SAS keys is a security issue that we would like to avoid. How do we utilize Structured Streaming from Event Hubs while authenticating to Azure AD (client_id and secret)? We know that we can use Python's Event Hubs library, but that will make have t...
Latest Reply
Hi, Could you please try structured streaming event hubs integration? https://docs.databricks.com/en/_extras/notebooks/source/structured-streaming-event-hubs-integration.html
- 546 Views
- 1 reply
- 0 kudos
Hi Team, I am new to Databricks; please see my question below. I have created a cluster, then a database and tables, and inserted data. Now I want to access this table data from my .NET console application, which is on the 6.0 framework (v...
Latest Reply
Hi, the Databricks extension can help you to do so. https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext
Please let us know if this helps. Thanks!
by AniP • New Contributor II
- 905 Views
- 3 replies
- 3 kudos
Hi, I created the workspace using the Quickstart (recommended) method, and at the time of creating the workspace it asked for the following parameters: AccountId, AWSRegion, BucketName, DataS3Bucket, IAMRole, Password, Username, WorkspaceName. The workspace was created successf...
Latest Reply
Hi, could you please elaborate on the error code here? There is some misconfiguration which is causing the error. Thanks!
2 More Replies
by none • New Contributor
- 981 Views
- 3 replies
- 0 kudos
I'm trying to get going with Databricks for the first time. It's told me to create a workspace, which takes me to AWS (I'm also new to AWS). Following the instructions through there gets it to start creating something, but then it just gets stuck on ...
Latest Reply
Hi, this looks like a workspace creation failure. Would like to know about the error details more. Thanks!
2 More Replies
by APKS • New Contributor
- 362 Views
- 1 reply
- 0 kudos
Hi, I am quite new to working with Databricks in VS Code. I am trying to figure out the best way to plot my data when running on a cluster. I would like to have the possibility to zoom and move the plot, as I have when plotting locally with Matplotlib...
Latest Reply
Hi @APKS, To plot your data when running on a Databricks cluster, you can use the %matplotlib inline magic command in a notebook cell. This will allow you to use Matplotlib to create plots and display them inline in the notebook.
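A minimal sketch of the inline approach. Note that `%matplotlib inline` renders static images, so the zoom/pan interactivity asked about would need an interactive backend (e.g. `%matplotlib widget` via ipympl) or a library like Plotly; treat those as options to verify for your setup:

```python
import io

# Headless-safe backend; in a notebook, %matplotlib inline selects its own.
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot([0, 1, 2, 3], [0, 1, 4, 9], marker="o")
ax.set_xlabel("x")
ax.set_ylabel("x squared")
ax.set_title("Rendered on the cluster, displayed in the notebook")

# Render to a buffer to confirm the figure draws without a display attached.
buf = io.BytesIO()
fig.savefig(buf, format="png")
```

In a notebook cell the figure displays automatically; the buffer step here just demonstrates that rendering happens on the driver, with no local data download required.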