- 33387 Views
- 3 replies
- 3 kudos
Given Copilot has now been released as a paid-for product, do we have a timeline for when it will be integrated into Databricks? Our team uses VS Code a lot for Copilot, and we think it would be super awesome to have it in our Databricks environment. Ou...
Latest Reply
@Vartika No, josephk didn't answer Aidan's question. It's about comparing Copilot with Databricks Assistant, and whether Copilot can be used in the Databricks workspace.
2 More Replies
by
AnuVat
• New Contributor III
- 41986 Views
- 7 replies
- 13 kudos
Hi, I am working on an ML project and I need to access the data in tables hosted in my Databricks cluster through a notebook that I am running locally. This has been very easy while running the notebooks in Databricks, but I cannot figure out how to do ...
Latest Reply
We can use the APIs and pyodbc to achieve this. Going through the official Databricks documentation should also help with accessing the data from outside the Databricks environment.
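One concrete way to do this from a local notebook is the `databricks-sql-connector` package. A minimal sketch, assuming you have a personal access token and the cluster's HTTP path from its JDBC/ODBC settings (the hostname, HTTP path, token, and table name below are placeholders, not real values):

```python
def normalize_host(url: str) -> str:
    """The connector expects a bare hostname, so strip any scheme and trailing slash."""
    return url.replace("https://", "").rstrip("/")

def fetch_rows(host: str, http_path: str, token: str, query: str):
    """Run a query against a Databricks cluster or SQL warehouse and return all rows."""
    from databricks import sql  # requires: pip install databricks-sql-connector
    with sql.connect(
        server_hostname=normalize_host(host),
        http_path=http_path,
        access_token=token,
    ) as conn:
        with conn.cursor() as cur:
            cur.execute(query)
            return cur.fetchall()

# Example usage (placeholder values):
# rows = fetch_rows("adb-1234567890123456.7.azuredatabricks.net",
#                   "/sql/protocolv1/o/0/0101-123456-abcdefgh",
#                   "<personal-access-token>",
#                   "SELECT * FROM my_table LIMIT 10")
```

The same query could also go through pyodbc with the Simba ODBC driver, but the SQL connector avoids a separate driver install.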
6 More Replies
- 13379 Views
- 2 replies
- 2 kudos
I run some jobs in the Databricks environment where some resources need authentication. I do this (and I need to) through the vault-cli in the init script. However, every time in the init script I need to install vault-cli and other libraries. Is ther...
Latest Reply
@João Victor Albuquerque: Yes, there are a few ways to pre-install libraries and tools in the Databricks environment. Cluster-scoped init scripts: you can specify a shell script to be run when a cluster is created or restarted. This script can includ...
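As a sketch of that first option, a cluster-scoped init script could install the Vault CLI once per cluster start instead of in every job. The version, download URL, and pip path below are assumptions to pin down for your own setup:

```shell
#!/bin/bash
# Cluster-scoped init script (sketch): pre-install the Vault CLI and any
# Python libraries so individual jobs don't have to repeat the install.
set -euo pipefail

# Version and URL are placeholders -- pin whatever your team standardizes on.
VAULT_VERSION="1.13.3"
curl -fsSL "https://releases.hashicorp.com/vault/${VAULT_VERSION}/vault_${VAULT_VERSION}_linux_amd64.zip" \
  -o /tmp/vault.zip
unzip -o /tmp/vault.zip -d /usr/local/bin

# Any extra Python dependencies the jobs need (path assumes Databricks Runtime):
/databricks/python/bin/pip install hvac
```

Upload the script to DBFS (or a workspace file) and reference it in the cluster's init-script settings; it then runs on every cluster create/restart.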
1 More Replies
- 2796 Views
- 2 replies
- 0 kudos
I am looking for a simple way to build a Structured Streaming pipeline that would automatically register a schema with Azure Schema Registry when converting a DataFrame column into Avro, and that could deserialize an Avro column based on the schema registry ur...
Latest Reply
Hi @Tomas Sedlon, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...
1 More Replies
- 2968 Views
- 2 replies
- 0 kudos
I want to make a connection to Databricks with KNIME. To do this I am using the "Create Databricks Environment" node. I have made the following configuration: 1. Installed the Databricks Simba JDBC driver. 2. Made the necessary configuration in Create Databric...
Latest Reply
Hi @Geethanjali Nataraj, hope everything is going great. Just wanted to check in to see if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell...
1 More Replies
by
haylee
• New Contributor II
- 2798 Views
- 4 replies
- 0 kudos
Commands attempted: dbutils.secrets.listScopes() and dbutils.secrets.get(scope = "{InsertScope}", key = "{InsertKey}"). Error: "shaded.v245.com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, ...
Latest Reply
Hi @Haylee Gaddy, just a friendly follow-up. Did any of the responses help you to resolve your question? If so, please mark it as best. Otherwise, please let us know if you still need help.
3 More Replies
- 2320 Views
- 2 replies
- 4 kudos
I'm stuck at a point where I can't automatically set up everything about a Databricks environment, because service principals can't be made admins at the account level (accounts.azuredatabricks.net; similar for AWS). Going into a bare t...
Latest Reply
Unfortunately there are still some limitations to doing IaC on Databricks with Terraform (e.g. another one is that you can't set up Key Vault as a secret store with a service principal). I think that instead of doing stuff manually, you can authenticate ...
1 More Replies
- 2926 Views
- 3 replies
- 7 kudos
I'm trying to figure out how a custom appender should be configured in a Databricks environment, but I cannot figure it out. When the cluster is running, in `driver logs`, the time is displayed as 'unknown' for my custom log file, and when the cluster is stopped, c...
Latest Reply
We're having the same problem with 11.3 LTS. Are there any updates? We would like to deliver log4j messages from Databricks Notebooks to custom log files and then upload those to S3 or DBFS. Best
2 More Replies
- 4847 Views
- 2 replies
- 2 kudos
Hi, when I try to read CSV files using pandas I get the error mentioned below. df = pd.read_csv("/dbfs/FileStore/tables/badrecord-1.csv") Error: FileNotFoundError: [Errno 2] No such file or directory: '/dbfs/FileStore/tables...
Latest Reply
dbutils.fs.ls("/FileStore/tables/badrecord-1.csv") shows the file is there in that particular location, but I am still getting the same error.
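One common cause of this mismatch: dbutils.fs.ls resolves DBFS paths directly, while pandas is a local tool that needs the /dbfs FUSE-mount path. A small sketch of the path translation (assuming the FUSE mount is enabled on your cluster; the file name is just the one from the question):

```python
def to_fuse_path(path: str) -> str:
    """Translate a dbfs:/ URI into the /dbfs FUSE path that local tools like pandas use."""
    if path.startswith("dbfs:/"):
        return "/dbfs/" + path[len("dbfs:/"):]
    return path

# On a cluster where the FUSE mount is available:
# import pandas as pd
# df = pd.read_csv(to_fuse_path("dbfs:/FileStore/tables/badrecord-1.csv"))
```

If the FUSE mount is not available (for example on some restricted cluster types), reading via spark.read.csv and converting with toPandas() is the fallback.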
1 More Replies
- 867 Views
- 0 replies
- 0 kudos
This is a 2-part question: How do I go about encrypting an existing root S3 bucket? Will this impact my Databricks environment (resources not being accessible, performance issues, etc.)?
- 2209 Views
- 2 replies
- 0 kudos
Hi, I am planning to run Koalas in a Databricks environment. What are the requirements for running Koalas there?
Latest Reply
Koalas is great! It really helps ease the transition from pandas to Spark, because you can use the same pandas functions/classes through the Koalas API, but everything runs on Spark in the background.
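To illustrate the "same pandas API" point, here is the same tiny operation in plain pandas and in Koalas. The Koalas half is commented out because it needs a Spark runtime (Koalas requires Spark 2.4+, and on newer runtimes it lives on as pyspark.pandas):

```python
import pandas as pd

# Plain pandas, runs anywhere:
pdf = pd.DataFrame({"x": [1, 2, 3]})
doubled = (pdf["x"] * 2).tolist()

# The same operation in Koalas -- near-identical API, executed on Spark.
# Assumes a Databricks Runtime (or any cluster) with koalas installed:
# import databricks.koalas as ks
# kdf = ks.DataFrame({"x": [1, 2, 3]})
# doubled_on_spark = (kdf["x"] * 2).to_list()
```

The practical requirements are therefore just a supported Spark version and the koalas package, both of which recent Databricks Runtimes ship with.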
1 More Replies
- 2378 Views
- 1 replies
- 0 kudos
By default any user can download a copy of the data they query in a notebook. Is it possible to prevent this?
Latest Reply
You can limit the ways that users can save copies of the data they have access to in a notebook, but not prevent it entirely. The download button which exists for cells in Databricks notebooks can be disabled in the "Workspace Settings" section of th...