Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Kara
by New Contributor II
  • 1028 Views
  • 0 replies
  • 1 kudos

Integrating Databricks repos with Azure DevOps

Hi, Databricks community. I am trying to integrate Databricks shared folder notebooks with Azure DevOps Git repositories. Can someone please point me to a basic training tutorial (or video) on how to get started, and to any best practices?

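For reference, linking a workspace folder to an Azure DevOps repo is usually done from the UI (Repos > Add Repo), but it can also be scripted against the Repos REST API. A minimal sketch, assuming a personal access token in DATABRICKS_TOKEN and hypothetical workspace URL, repo URL and workspace path:

import os
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical workspace URL
token = os.environ["DATABRICKS_TOKEN"]  # Databricks personal access token

# Create a workspace repo that tracks an Azure DevOps Git repository.
resp = requests.post(
    f"{host}/api/2.0/repos",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "url": "https://dev.azure.com/my-org/my-project/_git/my-repo",  # hypothetical repo
        "provider": "azureDevOpsServices",
        "path": "/Repos/shared/my-repo",  # hypothetical workspace path
    },
)
resp.raise_for_status()
print(resp.json())
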
yopbibo
by Contributor II
  • 2732 Views
  • 2 replies
  • 0 kudos

Resolved! Cluster configuration / notebook panel

Hi, Is it possible to let regular users see all running notebooks (in the notebook panel of the cluster) on a specific cluster they can use (attach and restart)? By default, admins can see all running notebooks and users can see only their own notebo...

Latest Reply
Prabakar
Databricks Employee
  • 0 kudos

Hi @Philippe CRAVE, a user can see a notebook only if they have permission on that notebook; otherwise they won't be able to see it. Unfortunately, there is no way for a normal user to see the notebooks attached to a cluster if they do not have per...

1 More Replies
akshay1
by New Contributor II
  • 2243 Views
  • 0 replies
  • 2 kudos

Data unloading to S3 bucket from Databricks.

Hi, I am completely new to Databricks and have a task to unload data from a Databricks table to an S3 location using Java/SQL. Is this possible? If yes, can you please help me?

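For reference, writing a table out to S3 is straightforward from a notebook once the cluster has credentials for the bucket (for example an instance profile). A minimal PySpark sketch with hypothetical table and bucket names; a pure-SQL route would be something like INSERT OVERWRITE DIRECTORY ... or CREATE TABLE ... LOCATION:

# 'spark' is the session that Databricks notebooks provide automatically.
df = spark.table("my_schema.my_table")            # hypothetical table name

(df.write
   .format("parquet")                             # or "csv", "json", "delta", ...
   .mode("overwrite")
   .save("s3://my-bucket/exports/my_table/"))     # hypothetical bucket/prefix
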
User16790091296
by Contributor II
  • 5435 Views
  • 1 replies
  • 2 kudos

How to restart a cluster on databricks using databricks-CLI?

I'm trying to restart an existing cluster in Databricks on Azure using databricks-cli. I'm using the following command: databricks clusters restart {"cluster_id": "0710-121255-liner30"} But it is giving me this error: Error: Missing option "--cluster-...

Latest Reply
User16766737456
Databricks Employee
  • 2 kudos

Can you try:

databricks clusters restart --cluster-id <the-cluster-id>

$ databricks clusters restart --help
Usage: databricks clusters restart [OPTIONS]

  Restarts a Databricks cluster given its ID.

  If the cluster is not currently in a RUNNING st...

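The same restart can also be triggered through the REST API if the CLI is not available. A minimal sketch in Python, assuming a hypothetical workspace URL and a token in DATABRICKS_TOKEN (the cluster ID is the one from the question):

import os
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical workspace URL
token = os.environ["DATABRICKS_TOKEN"]

# POST /api/2.0/clusters/restart restarts a cluster by its ID.
resp = requests.post(
    f"{host}/api/2.0/clusters/restart",
    headers={"Authorization": f"Bearer {token}"},
    json={"cluster_id": "0710-121255-liner30"},
)
resp.raise_for_status()
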
huggies_23
by New Contributor
  • 1231 Views
  • 0 replies
  • 0 kudos

Is it possible to specify a specific branch commit when deploying repo to a workspace via the Databricks CLI?

I would like to know if it is possible to include a specific commit identifier when updating a repo in a workspace via the Databricks CLI. Why? Currently we use the repos CLI to push updates to code throughout dev, test and prod (testing along the wa...

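The Repos API (which backs the repos CLI) appears to accept a branch or a tag rather than a raw commit SHA, so one common workaround is to tag the exact commit you want deployed and update the workspace repo to that tag. A minimal sketch, with a hypothetical workspace URL, repo ID and tag name:

import os
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical workspace URL
token = os.environ["DATABRICKS_TOKEN"]
repo_id = 123456789  # hypothetical repo ID (list repos via GET /api/2.0/repos)

# Check out the tag that marks the commit you want in this workspace.
resp = requests.patch(
    f"{host}/api/2.0/repos/{repo_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={"tag": "release-2022-08-17"},  # hypothetical tag name
)
resp.raise_for_status()
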
Taha_Hussain
by Databricks Employee
  • 14894 Views
  • 2 replies
  • 6 kudos

Resolved! Create a Dashboard: How do I visualize data with Databricks SQL or my BI tool?

Databricks SQL helps you query and visualize data so you can share real-time business insights with built-in dashboards or your favorite BI tools. This post helps you create queries, visualizations and dashboards and connect to your BI tools for deeper da...

Labels: Databricks SQL, Locked, DBSQL, Create A Query, Data Explorer
Latest Reply
Anonymous
Not applicable
  • 6 kudos

Thanks for the information, I will try to figure it out. Keep sharing such informative posts.

1 More Replies
Taha_Hussain
by Databricks Employee
  • 1307 Views
  • 0 replies
  • 3 kudos

Register for Databricks Office Hours August 17 & August 31 from 8:00am - 9:00am PT | 3:00pm - 4:00pm GMT. Databricks Office Hours connects you dire...

Register for Databricks Office Hours August 17 & August 31 from 8:00am - 9:00am PT | 3:00pm - 4:00pm GMT. Databricks Office Hours connects you directly with experts to answer your Databricks questions. Join us to: • Troubleshoot your technical questions...

Dua14
by New Contributor
  • 1768 Views
  • 2 replies
  • 1 kudos

Databricks and AWS Cloud watch agent issue

I'm facing a problem while connecting Databricks with AWS CloudWatch. I want to send certain logs to CloudWatch, but it seems like there is some connectivity issue between the two parties.

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Tushar Dua, please follow the blog below, which has details on how to monitor Databricks using CloudWatch: How to Monitor Databricks with AWS CloudWatch

1 More Replies
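
As a quick connectivity check before setting up the full CloudWatch agent, you can try pushing a single log event from a notebook with boto3. A minimal sketch, assuming the cluster's instance profile grants CloudWatch Logs permissions and using hypothetical group, stream and region names:

import time
import boto3

logs = boto3.client("logs", region_name="us-east-1")          # hypothetical region
group, stream = "/databricks/test-logs", "driver"              # hypothetical names

# Create the log group/stream if they do not exist yet.
for create in (lambda: logs.create_log_group(logGroupName=group),
               lambda: logs.create_log_stream(logGroupName=group, logStreamName=stream)):
    try:
        create()
    except logs.exceptions.ResourceAlreadyExistsException:
        pass

# Send one test event; if this fails, the issue is likely IAM or network egress.
logs.put_log_events(
    logGroupName=group,
    logStreamName=stream,
    logEvents=[{"timestamp": int(time.time() * 1000), "message": "hello from Databricks"}],
)
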
RonVBrown
by New Contributor
  • 4657 Views
  • 3 replies
  • 3 kudos
Latest Reply
Sivaprasad1
Valued Contributor II
  • 3 kudos

@RonVBrown (Customer): Could you please refer to the link below: https://docs.databricks.com/data/data-sources/elasticsearch.html Please try to use the OpenSearch library instead of the ES jar if it does not work: https://search.maven.org/artifact/org.opensearc...

2 More Replies
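
For reference, once the elasticsearch-spark (or opensearch-spark) connector JAR from the links above is installed on the cluster, reading an index looks roughly like the sketch below (hypothetical host and index name):

# Assumes the elasticsearch-spark connector library is attached to the cluster.
df = (spark.read
      .format("org.elasticsearch.spark.sql")
      .option("es.nodes", "my-es-host.example.com")   # hypothetical host
      .option("es.port", "9200")
      .option("es.nodes.wan.only", "true")            # typical for managed/cloud clusters
      .load("my-index"))                              # hypothetical index name
df.show()
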
118004
by New Contributor II
  • 791 Views
  • 0 replies
  • 0 kudos

Use databricks-sync import to migrate to new workspace

Hello, We are using the databricks-sync tool in an attempt to migrate from a legacy workspace into a new E2 account workspace. The tool exports JSON files successfully, but when I try to import, I receive various Terraform errors referencing undeclar...

jgrgn
by New Contributor
  • 1256 Views
  • 0 replies
  • 0 kudos

define notebook path from a parameter

Is there a way to define the notebook path based on a parameter from the calling notebook using %run? I am aware of dbutils.notebook.run(), but would like to have all the functions defined in the referenced notebook available in the calling noteboo...

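%run takes a literal path, so it cannot be parameterized directly; the usual options are a small per-environment wrapper notebook with a hard-coded %run line, or dbutils.notebook.run(), which accepts a computed path but runs the target in its own context (so its functions are not imported into the caller). A minimal sketch of the latter, with a hypothetical notebook path:

# Runs the target notebook with parameters; its functions are NOT imported here.
result = dbutils.notebook.run(
    "/Shared/helpers/my_functions",   # hypothetical path, could come from a variable
    600,                              # timeout in seconds
    {"env": "dev"},                   # parameters passed to the target's widgets
)
print(result)
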
BradSheridan
by Valued Contributor
  • 2422 Views
  • 0 replies
  • 0 kudos

Workflow parameters

Hey everyone! I'm close but can't seem to figure this out. I'm trying to add 2 notebooks to a Databricks Job. Instead of the first command in both notebooks being a connection to an RDS/Redshift cluster, I'd prefer to make that connection once and ha...

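A live database connection cannot be shared between notebooks in a job, since each task runs in its own context, but small values such as connection parameters can be passed with job task values, and each notebook can then open its own connection. A minimal sketch, with hypothetical task and key names:

# In the first task (task key "setup"): publish connection details for downstream tasks.
dbutils.jobs.taskValues.set(key="redshift_host", value="my-cluster.example.com")  # hypothetical host

# In the second task: read them back and open this notebook's own connection.
host = dbutils.jobs.taskValues.get(
    taskKey="setup", key="redshift_host", default=None, debugValue="localhost"
)
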
palzor
by New Contributor III
  • 1080 Views
  • 0 replies
  • 2 kudos

What is the best practice while loading delta table , do I infer the schema or provide the schema?

I am loading Avro files into Delta tables. I am doing this for multiple tables; some files are big (2-3 GB) and most of them are small, in the range of a few MBs. I am using Auto Loader to load the data into the Delta tables. My question is: What is the ...

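For reference (since the question asks about inferring vs. providing a schema), a common pattern for larger files is to supply an explicit schema so Auto Loader does not have to sample the Avro files; if you infer instead, cloudFiles.schemaLocation is required. A minimal sketch with hypothetical columns, paths and table name:

from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Hypothetical schema for the incoming Avro files.
schema = StructType([
    StructField("id", StringType(), True),
    StructField("amount", DoubleType(), True),
])

stream = (spark.readStream
          .format("cloudFiles")
          .option("cloudFiles.format", "avro")
          .schema(schema)                                   # explicit schema; omit to infer
          .load("s3://my-bucket/landing/my_table/"))        # hypothetical landing path

(stream.writeStream
       .option("checkpointLocation", "s3://my-bucket/_checkpoints/my_table/")
       .trigger(availableNow=True)                          # assumes a recent DBR
       .toTable("my_schema.my_table"))                      # hypothetical Delta table
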
anisha_93
by New Contributor II
  • 5133 Views
  • 2 replies
  • 1 kudos

Error in SQL statement: KeyProviderException: Failure to initialize configuration

I have a source Delta table to which I have selectively granted access for a particular pool ID (which can be thought of as a dummy user). From the pool ID interface, whenever I am running a SELECT on any of the tables, even though it has access to them, it is faili...

Latest Reply
alicewong20
New Contributor II
  • 1 kudos

Hello all, I got the same problem. Can anyone help?

1 More Replies
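
This KeyProviderException usually means the cluster the pool user is on does not have the ADLS storage credentials configured (the admin's cluster often carries them in its Spark config). A minimal sketch of configuring service-principal (OAuth) access from a notebook, with a hypothetical storage account name and secret scope:

storage = "mystorageacct"  # hypothetical ADLS Gen2 storage account name

spark.conf.set(f"fs.azure.account.auth.type.{storage}.dfs.core.windows.net", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage}.dfs.core.windows.net",
               "<application-id>")
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage}.dfs.core.windows.net",
               dbutils.secrets.get(scope="my-scope", key="sp-secret"))  # hypothetical scope/key
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage}.dfs.core.windows.net",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")
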
Dicer
by Valued Contributor
  • 4634 Views
  • 4 replies
  • 3 kudos

Resolved! Azure Databricks: Failed to extract data which is between two timestamps within those same dates using Pyspark

Data types: AAPL_Time: timestamp, AAPL_Close: float
Raw data:
AAPL_Time                        AAPL_Close
2015-05-11T08:00:00.000+0000     29.0344
2015-05-11T08:30:00.000+0000     29.0187
2015-05-11T09:00:00.000+0000     29.0346
2015-05-11T09:3...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Another thing to try: the hour() and minute() functions return integers.

3 More Replies
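
Building on the hour()/minute() suggestion above, here is a minimal PySpark sketch that keeps only rows whose time of day falls between two bounds (09:00 and 16:00 here), regardless of date; df is assumed to be the DataFrame holding the AAPL_Time column:

from pyspark.sql import functions as F

# Minutes since midnight for each row's timestamp.
minutes = F.hour("AAPL_Time") * 60 + F.minute("AAPL_Time")

# Keep rows between 09:00 and 16:00 on any date.
filtered = df.where((minutes >= 9 * 60) & (minutes <= 16 * 60))
filtered.show()
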
