Data Engineering
Forum Posts

Aidonis
by New Contributor III
  • 6846 Views
  • 3 replies
  • 3 kudos

Copilot Databricks integration

Given that Copilot has now been released as a paid-for product, do we have a timeline for when it will be integrated into Databricks? Our team uses VS Code a lot for Copilot, and we think it would be super awesome to have it in our Databricks environment. Ou...

Latest Reply
prasad_vaze
New Contributor III
  • 3 kudos

@Vartika No, josephk didn't answer Aidan's question. It's about comparing Copilot with Databricks Assistant, and whether Copilot can be used in the Databricks workspace.

2 More Replies
AnuVat
by New Contributor III
  • 15682 Views
  • 7 replies
  • 12 kudos

How to read data from a table into a dataframe outside of Databricks environment?

Hi, I am working on an ML project and I need to access the data in tables hosted in my Databricks cluster through a notebook that I am running locally. This has been very easy while running the notebooks in Databricks, but I cannot figure out how to do ...

Latest Reply
chakri
New Contributor III
  • 12 kudos

We can use APIs and pyodbc to achieve this. Going through the official Databricks documentation might also be helpful for accessing data from outside the Databricks environment.

6 More Replies
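The reply above points at APIs or pyodbc without showing either. A minimal sketch using the databricks-sql-connector package (an assumption; the thread does not name a specific library) might look like this, with the hostname, HTTP path, token, and table name all placeholders for your own workspace:

```python
# Sketch: read a Databricks table into a local pandas DataFrame over a
# SQL warehouse endpoint. Requires: pip install databricks-sql-connector pandas.
import pandas as pd


def table_query(table: str) -> str:
    # Minimal query builder; real code should validate/quote the identifier.
    return f"SELECT * FROM {table}"


def read_table(server_hostname: str, http_path: str, access_token: str,
               table: str) -> pd.DataFrame:
    # Imported lazily so this module loads even without the connector installed.
    from databricks import sql
    with sql.connect(server_hostname=server_hostname,
                     http_path=http_path,
                     access_token=access_token) as conn:
        with conn.cursor() as cur:
            cur.execute(table_query(table))
            columns = [desc[0] for desc in cur.description]
            return pd.DataFrame(cur.fetchall(), columns=columns)
```

The connection values come from the SQL warehouse's "Connection details" tab in the workspace, plus a personal access token.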
joao_albuquerqu
by New Contributor II
  • 1427 Views
  • 2 replies
  • 1 kudos

Is it possible to have Cluster with pre-installed dependencies?

I run some jobs in the Databricks environment where some resources need authentication. I do this (and I need to) through the vault-cli in the init script. However, every time the init script runs, I need to install vault-cli and other libraries. Is ther...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@João Victor Albuquerque: Yes, there are a few ways to pre-install libraries and tools in the Databricks environment. Cluster-scoped init scripts: you can specify a shell script to be run when a cluster is created or restarted. This script can includ...

1 More Replies
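To make the cluster-scoped init script idea concrete, here is a hedged sketch of a script that installs the Vault tooling once per cluster start instead of once per job. The Vault version, install paths, and DBFS location are illustrative assumptions, not values from the thread:

```python
# Sketch: a cluster-scoped init script that pre-installs Vault tooling on
# every cluster (re)start, so per-job installs in notebooks go away.
INIT_SCRIPT = """#!/bin/bash
set -euo pipefail
# Python-side Vault client (version/package choice is an example)
/databricks/python/bin/pip install hvac
# Standalone vault CLI binary (pin whatever version you actually use)
curl -fsSL -o /tmp/vault.zip \\
  https://releases.hashicorp.com/vault/1.13.3/vault_1.13.3_linux_amd64.zip
unzip -o /tmp/vault.zip -d /usr/local/bin
"""


def write_init_script(path: str = "/dbfs/databricks/init/install-vault.sh") -> None:
    # On a Databricks driver, /dbfs is the FUSE mount of DBFS; from a notebook,
    # dbutils.fs.put("dbfs:/databricks/init/install-vault.sh", INIT_SCRIPT, True)
    # achieves the same thing.
    with open(path, "w") as f:
        f.write(INIT_SCRIPT)
```

After uploading, reference the script path under the cluster's "Init Scripts" settings so it runs automatically on every start and restart.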
scalasparkdev
by New Contributor
  • 1377 Views
  • 2 replies
  • 0 kudos

Pyspark Structured Streaming Avro integration to Azure Schema Registry with Kafka/Eventhub in Databricks environment.

I am looking for a simple way to have a structured streaming pipeline that would automatically register a schema to Azure schema registry when converting a df col into avro and that would be able to deserialize an avro col based on schema registry ur...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Tomas Sedlon, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...

1 More Replies
Geetha
by New Contributor
  • 1549 Views
  • 2 replies
  • 0 kudos

Connection to Databricks through KNIME

I want to make a connection to Databricks with KNIME. To do this I am using the "Create Databricks Environment" node. I have made the following configuration: 1. Installed the Databricks Simba JDBC driver; 2. Made the necessary configuration in Create Databric...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Geethanjali Nataraj, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell...

1 More Replies
haylee
by New Contributor II
  • 1086 Views
  • 4 replies
  • 0 kudos

I added a secret scope to the Databricks environment, and I get this error when trying to run either of the following:

Commands attempted: dbutils.secrets.listScopes() and dbutils.secrets.get(scope = "{InsertScope}", key = "{InsertKey}"). Error: "shaded.v245.com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, ...

Latest Reply
jose_gonzalez
Moderator
  • 0 kudos

Hi @Haylee Gaddy, just a friendly follow-up. Did any of the responses help you to resolve your question? If so, please mark the best one as the answer. Otherwise, please let us know if you still need help.

3 More Replies
Jfoxyyc
by Valued Contributor
  • 805 Views
  • 2 replies
  • 4 kudos

Databricks Terraform - how to manage databricks entirely through Terraform?

I'm stuck at a point where I can't automatically set up everything about a databricks environment due to the fact that service principals can't be made an admin at the account level (accounts.azuredatabricks.net, similar for aws). Going into a bare t...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 4 kudos

Unfortunately, there are still some limitations to doing IaC on Databricks with Terraform (for example, another one is that you can't set up Key Vault as a secret store with a Service Principal). I think that instead of doing stuff manually, you can authenticate ...

1 More Replies
cchiulan
by New Contributor III
  • 1361 Views
  • 3 replies
  • 7 kudos

Databricks Log4J Custom Appender Not Working as expected

I'm trying to figure out how a custom appender should be configured in a Databricks environment, but I cannot work it out. When the cluster is running, in the `driver logs`, the time is displayed as 'unknown' for my custom log file, and when the cluster is stopped, c...

Latest Reply
Wolf
New Contributor II
  • 7 kudos

We're having the same problem with 11.3 LTS. Are there any updates? We would like to deliver log4j messages from Databricks Notebooks to custom log files and then upload those to S3 or DBFS. Best

2 More Replies
tinendra
by New Contributor III
  • 3645 Views
  • 3 replies
  • 3 kudos

How to read a file in pandas in a Databricks environment?

Hi, when I was trying to read a CSV file using pandas I got the error mentioned below. df = pd.read_csv("/dbfs/FileStore/tables/badrecord-1.csv") Error: FileNotFoundError: [Errno 2] No such file or directory: '/dbfs/FileStore/tables...

Latest Reply
tinendra
New Contributor III
  • 3 kudos

dbutils.fs.ls("/FileStore/tables/badrecord-1.csv") shows that the file is there in that particular location, but I am still getting the same error.

2 More Replies
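The usual cause of this mismatch (dbutils sees the file, pandas does not) is the path scheme: dbutils addresses DBFS directly, while pandas only understands the driver's local filesystem, where DBFS appears under the /dbfs FUSE prefix. A hedged sketch, using the path from the thread:

```python
# Sketch: pandas cannot read "dbfs:/" URIs or bare DBFS paths; from the
# driver it needs the /dbfs FUSE prefix. A Spark fallback covers cluster
# types where the FUSE mount is unavailable.
import pandas as pd


def to_fuse_path(dbfs_path: str) -> str:
    # "/FileStore/tables/x.csv" -> "/dbfs/FileStore/tables/x.csv"
    return dbfs_path if dbfs_path.startswith("/dbfs/") else "/dbfs" + dbfs_path


def read_csv_from_dbfs(dbfs_path: str = "/FileStore/tables/badrecord-1.csv"):
    try:
        return pd.read_csv(to_fuse_path(dbfs_path))
    except FileNotFoundError:
        # Fallback: `spark` is the SparkSession predefined in Databricks
        # notebooks; only works when run on a cluster.
        return spark.read.csv("dbfs:" + dbfs_path, header=True).toPandas()
```

So `dbutils.fs.ls("/FileStore/tables/badrecord-1.csv")` and `pd.read_csv("/dbfs/FileStore/tables/badrecord-1.csv")` refer to the same file; if the second still fails on a cluster where the first succeeds, the cluster type may not expose the FUSE mount, and the Spark fallback applies.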
Kaniz
by Community Manager
  • 511 Views
  • 2 replies
  • 1 kudos

What are the best practices for the isolation of different environments in Databricks? I am trying to find out the best practice around Databricks env...

What are the best practices for the isolation of different environments in Databricks? I am trying to find out the best practice around Databricks environment creation, like dev, stage, and prod. Should it be: 1. A single Databricks account with multip...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hey @Abhijit Rai, does @Atanu Sarkar's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? Thank you!

1 More Replies
User16790091296
by Contributor II
  • 619 Views
  • 1 replies
  • 0 kudos
Latest Reply
aladda
Honored Contributor II
  • 0 kudos

It depends on what you're looking for from a management perspective, but one option is the Account API, which allows deploying/updating/configuring multiple workspaces in a given E2 account. Use this API to programmatically deploy, update, and delete works...

User16826990884
by New Contributor III
  • 428 Views
  • 0 replies
  • 0 kudos

Encrypt root S3 bucket

This is a 2-part question: 1) How do I go about encrypting an existing root S3 bucket? 2) Will this impact my Databricks environment (resources not being accessible, performance issues, etc.)?

User16826994223
by Honored Contributor III
  • 977 Views
  • 2 replies
  • 0 kudos

Requirement to Run Koalas

Hi, I am planning to run Koalas in a Databricks environment. What are the requirements for running Koalas there?

Latest Reply
User16776431030
New Contributor III
  • 0 kudos

Koalas is great! This really helps ease the transition from Pandas to Spark, because you can just use the same Pandas functions/classes through the Koalas API but everything runs in the background in Spark.

1 More Replies
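A small sketch of the pandas-to-Koalas parallel the reply describes. Recent Databricks Runtimes include Koalas preinstalled; elsewhere it needs PySpark plus `pip install koalas`, and on Spark 3.2+ the same API was folded into PySpark itself as `pyspark.pandas` (details hedged; check the runtime release notes for your version):

```python
# Sketch: pandas-style syntax, Spark-distributed execution. Must run on an
# environment with PySpark available, hence the lazy import inside a function.
def koalas_demo():
    import databricks.koalas as ks  # on Spark 3.2+: import pyspark.pandas as ks
    # Same constructor and column syntax as pandas.DataFrame
    kdf = ks.DataFrame({"x": [1, 2, 3]})
    # Executed by Spark under the hood, returned as a plain Python int
    return int(kdf["x"].sum())
```

So the main requirement is simply a working Spark environment; the Koalas API itself adds no extra infrastructure.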
User16826992666
by Valued Contributor
  • 1350 Views
  • 1 replies
  • 0 kudos

Can I prevent users from downloading data from a notebook?

By default any user can download a copy of the data they query in a notebook. Is it possible to prevent this?

Latest Reply
User16826992666
Valued Contributor
  • 0 kudos

You can limit the ways that users can save copies of the data they have access to in a notebook, but not prevent it entirely. The download button which exists for cells in Databricks notebooks can be disabled in the "Workspace Settings" section of th...
