Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

db-avengers2rul
by Contributor II
  • 1153 Views
  • 1 replies
  • 1 kudos

Resolved! DBFS Rest Api is disabled

Dear Team, I have created a Databricks account using GCP. When I tried to create a token, configure the Databricks CLI, and connect, the command databricks fs ls fails with the error below: Error: b'{"error_code":"FEATURE_DISABLED","message":"DBFS Rest Api is disable...

Latest Reply
Prabakar
Esteemed Contributor III

Hi @Rakesh Reddy Gopidi This is a known limitation with the DBFS API and GCP. We are planning to redesign the DBFS API, and we do not want to gain more users that we might later need to migrate to a new API. If this is really required for you, please pro...

johnny1
by New Contributor II
  • 1672 Views
  • 2 replies
  • 0 kudos

Why does it still complain that the REST API version is 2.0 even though it is set to 2.1?

root@387ece6d15b2:/usr/workspace# databricks --version
Version 0.17.3
root@387ece6d15b2:/usr/workspace# databricks jobs configure --version=2.1
root@387ece6d15b2:/usr/workspace# databricks jobs get --job-id 123
WARN: Your CLI is configured to use Jobs AP...

Latest Reply
johnny1
New Contributor II

Command "databricks jobs configure --version=2.1" not work.workaround with adding option "--version=2.1" to each databricks jobs/runs command .It is not very convenient.

1 More Replies
swetha
by New Contributor III
  • 2379 Views
  • 3 replies
  • 4 kudos

Resolved! Retrieving the job IDs of notebooks running inside tasks

I have created a job. Inside the job I have created independent tasks, and I have used concurrent futures to achieve parallelism; in each task there are a couple of independent notebooks running. Each notebook running ha...
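For context, a minimal sketch of the pattern the question describes: running independent notebooks in parallel from one task with concurrent futures. The notebook paths and timeout are placeholders, and dbutils is only available inside a Databricks notebook.

```python
# Hedged sketch of running independent notebooks in parallel with
# concurrent.futures. Paths and timeout are placeholders; dbutils is the
# global object available inside a Databricks notebook.
from concurrent.futures import ThreadPoolExecutor

notebooks = ["/Repos/project/notebook_a", "/Repos/project/notebook_b"]  # placeholders

def run_notebook(path):
    # dbutils.notebook.run returns the child notebook's exit value (a string),
    # so each child can report its own run ID via dbutils.notebook.exit(...).
    return dbutils.notebook.run(path, 3600)

with ThreadPoolExecutor(max_workers=len(notebooks)) as pool:
    results = list(pool.map(run_notebook, notebooks))

print(results)
```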

Latest Reply
Anonymous
Not applicable

Hi @swetha kadiyala Hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Th...

2 More Replies
Shubhamgoyal
by New Contributor
  • 1715 Views
  • 2 replies
  • 1 kudos

Access Databricks SQL databases using the REST API

Hi All, we want to read/write data to Databricks SQL using Power Apps. I have been looking for documentation around accessing databases in Databricks SQL via the REST API. Appreciate your help on this.

Latest Reply
byrdman
New Contributor III

With the Databricks API you can start a workflow job. Build the job to ingest your data into tables.
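A minimal sketch of what the reply suggests: triggering an existing job ("workflow") through the Jobs 2.1 run-now endpoint. The workspace URL, token, and job ID are placeholders.

```python
# Hedged sketch: trigger an existing Databricks job via the Jobs 2.1 run-now
# endpoint. HOST, TOKEN and the job_id are placeholders.
import requests

HOST = "https://<your-workspace>.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"                      # placeholder

resp = requests.post(
    f"{HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"job_id": 123},  # placeholder job ID
)
resp.raise_for_status()
print(resp.json()["run_id"])  # run ID of the triggered job run
```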

1 More Replies
Abishek
by Valued Contributor
  • 1028 Views
  • 1 replies
  • 2 kudos

www.databricks.com

How to Migrate Your Data and AI Workloads to Databricks With the AWS Migration Acceleration Program: https://www.databricks.com/blog/2022/08/19/how-to-migrate-your-data-and-ai-workloads-to-databricks-with-the-aws-migration-acceleration-program.html In t...

Latest Reply
jose_gonzalez
Moderator

Thank you for sharing this information @Abishek Subramanian

arthur_wang
by New Contributor
  • 3448 Views
  • 2 replies
  • 1 kudos

How does Task Orchestration compare to Airflow (for Databricks-only jobs)?

One of my clients has been orchestrating Databricks notebooks using Airflow + the REST API. They're curious about the pros/cons of switching these jobs to Databricks Jobs with Task Orchestration. I know there are all sorts of considerations - for example,...
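For readers comparing the two approaches, a hedged sketch of the Airflow side of the pattern the question mentions, using the official Databricks provider to trigger an existing job. The DAG id, connection id, and job ID are placeholders.

```python
# Hedged sketch: trigger an existing Databricks job from Airflow with the
# Databricks provider. DAG id, connection id and job_id are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="databricks_job_trigger",   # placeholder
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_job = DatabricksRunNowOperator(
        task_id="run_databricks_job",
        databricks_conn_id="databricks_default",  # Airflow connection (placeholder)
        job_id=123,                               # placeholder Databricks job ID
    )
```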

Latest Reply
Shourya
New Contributor III

@Kaniz Fatma Hello Kaniz, I'm currently working with a major enterprise client looking to choose between Airflow and Databricks for job scheduling. Our entire code base is in Databricks, and we are trying to figure out the complexities t...

1 More Replies
RicksDB
by Contributor II
  • 2415 Views
  • 2 replies
  • 3 kudos

Resolved! Maximum job execution per hour

Hi, what is the maximum number of jobs we can execute in an hour for a given workspace? This page mentions 5000: https://docs.microsoft.com/en-us/azure/databricks/data-engineering/jobs/jobs The number of jobs a workspace can create in an hour is limited ...

Latest Reply
Sivaprasad1
Valued Contributor II

Up to 5000 jobs (both normal and ephemeral) may be created per hour in a single workspace.

1 More Replies
him
by New Contributor III
  • 1142 Views
  • 1 replies
  • 3 kudos
Latest Reply
Debayan
Esteemed Contributor III

You can refer to the example below: https://docs.databricks.com/dev-tools/api/latest/examples.html#upload-a-big-file-into-dbfs
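For convenience, a hedged sketch of the streaming-upload pattern that the linked example describes: create a DBFS handle, add base64-encoded blocks of at most 1 MB, then close the handle. The workspace URL, token, and file paths are placeholders.

```python
# Hedged sketch of the DBFS streaming upload (create / add-block / close).
# HOST, the token and the file paths are placeholders.
import base64
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"           # placeholder
HEADERS = {"Authorization": "Bearer <personal-access-token>"}     # placeholder

def dbfs(endpoint, payload):
    r = requests.post(f"{HOST}/api/2.0/dbfs/{endpoint}", headers=HEADERS, json=payload)
    r.raise_for_status()
    return r.json()

handle = dbfs("create", {"path": "/tmp/big-file.bin", "overwrite": True})["handle"]
with open("big-file.bin", "rb") as f:                 # local file, placeholder name
    while chunk := f.read(1024 * 1024):               # 1 MB blocks (API limit per block)
        dbfs("add-block", {"handle": handle, "data": base64.b64encode(chunk).decode()})
dbfs("close", {"handle": handle})
```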

PrebenOlsen
by New Contributor III
  • 1831 Views
  • 1 replies
  • 1 kudos

Resolved! Why does @dlt.table from a table give different results than from a view?

I have some data in silver that I read in as a view using the __apply_changes function. I create a table based on this, and I then want to create my gold table after doing a .groupBy() and .pivot(). The transformations I do in the gold table aren...

Latest Reply
PrebenOlsen
New Contributor III

I have found a temporary solution. The .pivot("columnName") should automatically grab all the values it can find, but for some reason it does not. I need to specify the values, using .pivot("group_name", "group0", "group1", "group2"...) ...
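A self-contained PySpark sketch of the workaround described above: passing the pivot values explicitly instead of letting .pivot() infer them. The column names and toy data are placeholders modeled on the post.

```python
# Hedged sketch: pivot with an explicit list of values rather than inference.
# The toy DataFrame and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Toy data standing in for the silver table described in the post.
silver_df = spark.createDataFrame(
    [("a", "group0", 1), ("a", "group1", 2), ("b", "group2", 3)],
    ["key", "group_name", "value"],
)

gold_df = (
    silver_df.groupBy("key")
    .pivot("group_name", ["group0", "group1", "group2"])  # explicit pivot values
    .agg(F.sum("value"))
)
gold_df.show()
```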

antoniodavideca
by New Contributor III
  • 2974 Views
  • 5 replies
  • 1 kudos

Resolved! Jobs REST Api - Run a Job that is connected to a git_source

With the Jobs REST API it is possible to create a new job, specifying a git_source. My question is about triggering the job. The Jobs REST API also makes it possible to trigger a job using the job_id, but I don't find a way to tell Databricks what the en...
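For context, a hedged sketch of a Jobs 2.1 job specification that pins the job's source to a Git repository via git_source, which is the setup the question refers to. The repository URL, branch, notebook path, and cluster ID are placeholders.

```python
# Hedged sketch of a Jobs 2.1 job spec with git_source. All values are
# placeholders; the spec can be POSTed to /api/2.1/jobs/create with an
# authorized request.
job_spec = {
    "name": "git-backed-job",
    "git_source": {
        "git_url": "https://github.com/org/repo",  # placeholder repository
        "git_provider": "gitHub",
        "git_branch": "main",                      # branch checked out when the job runs
    },
    "tasks": [{
        "task_key": "main_task",
        "notebook_task": {"notebook_path": "notebooks/etl", "source": "GIT"},  # path in the repo
        "existing_cluster_id": "<cluster-id>",     # placeholder
    }],
}
```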

Latest Reply
Prabakar
Esteemed Contributor III

Ah, got it. So is your issue resolved, or are you looking for further information?

4 More Replies
antoniodavideca
by New Contributor III
  • 2125 Views
  • 2 replies
  • 0 kudos

Jobs REST Api - Create new Job with a new Cluster, and install a Maven Library on the Cluster

I need to use the Jobs REST API to create a job on our Databricks cluster. At job creation, it is possible to specify an existing cluster or create a new one. I can forward a lot of information to the cluster, but what I would like to specify is ...

Latest Reply
Prabakar
Esteemed Contributor III

@Antonio Davide Cali You can use the existing cluster in your JSON to use it for the job. To update or push libraries to the job, you can use the JobsUpdate API. As you want to push libraries to the cluster, you can push them using the new setting an...
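A hedged sketch of the settings payload this reply points at: task-level libraries (here a Maven coordinate) inside the job settings sent to the Jobs update API. The coordinates, cluster fields, and notebook path are placeholders.

```python
# Hedged sketch: job settings carrying a Maven library for the job cluster.
# All values are placeholders.
new_settings = {
    "tasks": [{
        "task_key": "main_task",
        "new_cluster": {
            "spark_version": "11.3.x-scala2.12",   # placeholder
            "node_type_id": "Standard_DS3_v2",     # placeholder
            "num_workers": 2,
        },
        "notebook_task": {"notebook_path": "/Jobs/etl"},  # placeholder
        "libraries": [
            {"maven": {"coordinates": "com.example:library:1.0.0"}}  # placeholder coordinate
        ],
    }]
}
# This payload can be sent as {"job_id": <id>, "new_settings": new_settings}
# to /api/2.1/jobs/update (or jobs/reset) with an authorized POST request.
```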

1 More Replies
RS1
by New Contributor III
  • 3731 Views
  • 6 replies
  • 7 kudos

Data & AI Summit 2022 - Training Videos of paid Instructor led sessions not yet uploaded. @kaniz fatma

@Kaniz Fatma I attended the Advanced Machine Learning with Databricks training virtually last week. I am still unable to get the day 2 session videos of any of the instructor-led paid trainings. They are supposed to be available for replay within 24...

Latest Reply
RS1
New Contributor III

Hi @Kaniz Fatma, they uploaded the full video for the Advanced Machine Learning with Databricks course day 2. Thank you for the follow-up. But we still have the same issue with Apache Spark Programming with Databricks - Bundle: Day 2 Training. Can you...

5 More Replies
Anonymous
by Not applicable
  • 11323 Views
  • 26 replies
  • 4 kudos

Use Case Sharing Sweepstakes!

Use Case Sharing Sweepstakes! Data + AI Summit is in full swing, and we know you are just as excited as we are to learn about the new and exciting things happening at Databricks. From notebooks to the Lakehouse, we know some of these new features wil...

Latest Reply
AmanSehgal
Honored Contributor III

Cloning libraries when cloning clusters: currently, when we clone clusters, the externally added libraries aren't copied as part of the cloning process. It's expected behavior, but a missing feature. At times new developers end up spending a lot of time in debug...
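Until such a feature exists, a hedged sketch of one possible workaround: copying the installed libraries from the source cluster to its clone with the Libraries API. The workspace URL, token, and cluster IDs are placeholders.

```python
# Hedged sketch: copy installed libraries from a source cluster to a cloned
# cluster via the Libraries API. HOST, token and cluster IDs are placeholders.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"            # placeholder
HEADERS = {"Authorization": "Bearer <personal-access-token>"}      # placeholder

# List libraries currently attached to the source cluster.
status = requests.get(
    f"{HOST}/api/2.0/libraries/cluster-status",
    headers=HEADERS,
    params={"cluster_id": "<source-cluster-id>"},                  # placeholder
)
status.raise_for_status()
libs = [s["library"] for s in status.json().get("library_statuses", [])]

# Install the same libraries on the cloned cluster.
install = requests.post(
    f"{HOST}/api/2.0/libraries/install",
    headers=HEADERS,
    json={"cluster_id": "<cloned-cluster-id>", "libraries": libs},  # placeholder
)
install.raise_for_status()
```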

25 More Replies
Jingalls
by New Contributor II
  • 633 Views
  • 1 replies
  • 2 kudos

The Data + AI Summit is a blast so far. There are so many new technologies being released, such as Delta Lake 2.0 being open source.


Latest Reply
Zzof
New Contributor II

Agreed! You should check out the Azure booth if you haven't already; they have a really cool demo.
