Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Ismail1
by New Contributor III
  • 2360 Views
  • 2 replies
  • 3 kudos

Resolved! API Authentication

I am trying to run some API calls to the account console. I have tried every possible syntax, encoded and decoded, to get the call to succeed, but it returns a "user not authenticated" error. When I tried with the account admin it worked. I need t...
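
For context, a minimal sketch of one such account-console call, assuming the AWS account endpoint and basic authentication with placeholder credentials (as the thread suggests, the caller must be an account admin rather than a workspace admin for the request to succeed):

```python
# Minimal sketch: list workspaces via the Databricks Account API (AWS account console).
# The account ID and credentials below are placeholders; the caller must be an account admin.
import requests

ACCOUNT_ID = "<databricks-account-id>"
ADMIN_USER = "<account-admin-email>"
ADMIN_PASSWORD = "<account-admin-password>"

url = f"https://accounts.cloud.databricks.com/api/2.0/accounts/{ACCOUNT_ID}/workspaces"

resp = requests.get(url, auth=(ADMIN_USER, ADMIN_PASSWORD))
resp.raise_for_status()  # non-admin credentials typically fail here with an authentication error

for ws in resp.json():
    print(ws["workspace_id"], ws["workspace_name"])
```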

Latest Reply
Ismail1
New Contributor III
  • 3 kudos

Hi Venkat, that sounds like a good idea. Thanks

1 More Reply
Karthikeyan1
by New Contributor II
  • 1053 Views
  • 1 reply
  • 2 kudos

How to get the Databricks default pricing through an API or any endpoint (if available)?

Attaching screenshots FYI from the official site. I've checked the browser's Inspect panel, but no API calls are made specifically for these default prices. Are there any endpoints available to scrape this?

Latest Reply
Karthikeyan1
New Contributor II
  • 2 kudos

@All Users Group Any views on this?

HamzaJosh
by New Contributor II
  • 12845 Views
  • 6 replies
  • 3 kudos

I want to use Databricks workers to run a function in parallel on the worker nodes

I have a function making API calls. I want to run this function in parallel so I can use the workers in Databricks clusters. I have tried with ThreadPoolExecutor() as executor: results = executor.map(getspeeddata, alist) to run m...

Latest Reply
HamzaJosh
New Contributor II
  • 3 kudos

You guys are not getting the point: I am making API calls in a function and want to store the results in a DataFrame. I want multiple processes to run this task in parallel. How do I create a UDF and use it in a DataFrame when the task is calling an ...
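
As one possible answer, a minimal sketch of the UDF approach, assuming the items to process can be loaded into a Spark DataFrame; the API endpoint, response handling, and column names are placeholders, and getspeeddata is adapted from the thread:

```python
# Minimal sketch: distribute per-row API calls across Databricks workers with a Spark UDF.
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType
import requests

spark = SparkSession.builder.getOrCreate()  # already defined as `spark` in a Databricks notebook

def getspeeddata(item: str) -> str:
    # Placeholder API call; the real URL, parameters, and parsing are assumptions.
    resp = requests.get("https://example.com/speed", params={"q": item}, timeout=30)
    return resp.text

getspeeddata_udf = udf(getspeeddata, StringType())

# Each partition is processed on a worker, so repartitioning controls the parallelism.
alist = ["a", "b", "c", "d"]
df = spark.createDataFrame([(x,) for x in alist], ["item"]).repartition(8)

result_df = df.withColumn("speed_data", getspeeddata_udf("item"))
result_df.show(truncate=False)
```

The results land directly in a DataFrame column, and increasing the number of partitions (up to the number of rows) spreads the API calls over more worker cores.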

5 More Replies
User16826994223
by Honored Contributor III
  • 993 Views
  • 1 reply
  • 0 kudos
Latest Reply
User16826994223
Honored Contributor III
  • 0 kudos

Design to honor API and other limits of the platform.
  • Max API calls per hour = 1500
  • Jobs per hour per workspace = 1000
  • Maximum concurrent notebooks per cluster = 145
