Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

horatiug
by New Contributor III
  • 3229 Views
  • 4 replies
  • 1 kudos

Databricks workspace with custom VPC using terraform in Google Cloud

I am working on Google Cloud and want to create a Databricks workspace with a custom VPC using Terraform. Is that supported? If yes, is it similar to the AWS approach? Thank you, Horatiu

  • 3229 Views
  • 4 replies
  • 1 kudos
Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @horatiu guja, GCP workspace provisioning using Terraform is in public preview now. Please refer to the doc below for the steps: https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/gcp-workspace

  • 1 kudos
3 More Replies
galop12
by New Contributor
  • 4373 Views
  • 3 replies
  • 0 kudos

Databricks workspace (with managed VNET) upgrade to premium failing

I am trying to upgrade our Databricks workspace from standard to premium but am running into issues. The workspace is currently deployed in a managed VNET. I tried the migration tool as well as just re-creating a premium workspace with the same parameter...

  • 4373 Views
  • 3 replies
  • 0 kudos
Latest Reply
lskw
New Contributor II
  • 0 kudos

Hi, I have the same situation when trying to upgrade from Standard to Premium on Azure. My error: "ConflictWithNetworkIntentPolicy","message":"Found conflicts with NetworkIntentPolicy. Details: Subnet or Virtual Network cannot have resources or properties...

  • 0 kudos
2 More Replies
Manimkm08
by New Contributor III
  • 3017 Views
  • 3 replies
  • 0 kudos

Jobs are failing with AWS_INSUFFICIENT_FREE_ADDRESSES_IN_SUBNET_FAILURE

We have assigned 3 dedicated subnets (one per AZ) to the Databricks workspace, each with a /24 CIDR, but noticed that all the jobs are running in a single subnet, which causes AWS_INSUFFICIENT_FREE_ADDRESSES_IN_SUBNET_FAILURE. Is there a way to segregat...

  • 3017 Views
  • 3 replies
  • 0 kudos
Latest Reply
Manimkm08
New Contributor III
  • 0 kudos

@karthik p I have configured one subnet per AZ (three in total) and followed the same steps as mentioned in the document. Is there a way to check whether Databricks uses all the subnets or not? @Debayan Mukherjee I am not getting how to use an LB in this set... (a rough sketch of the subnet address math follows below)

  • 0 kudos
2 More Replies
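For reference, a rough sketch of the address math per /24 subnet, assuming AWS's standard five reserved addresses per subnet; how many of those addresses each Databricks node actually consumes is not covered here and should be checked against the Databricks networking docs:

```python
import ipaddress

# AWS reserves the first four and the last IP address in every subnet (5 total).
AWS_RESERVED_PER_SUBNET = 5

def usable_addresses(cidr: str) -> int:
    """Return the number of IP addresses AWS leaves usable in the given subnet."""
    network = ipaddress.ip_network(cidr)
    return network.num_addresses - AWS_RESERVED_PER_SUBNET

# Placeholder CIDRs standing in for the three per-AZ subnets from the thread.
for cidr in ["10.0.1.0/24", "10.0.2.0/24", "10.0.3.0/24"]:
    print(cidr, "->", usable_addresses(cidr), "usable addresses")
# Each /24 yields 251 usable addresses, shared by all cluster nodes placed in it.
```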
NavyaD
by New Contributor III
  • 2626 Views
  • 2 replies
  • 4 kudos

How to read a SQL notebook in a Python notebook on the workspace

I have a notebook named ecom_sellout.sql under the path notebooks/python/dataloader/queries. I have another notebook (named dataloader, under the path notebooks/python/dataloader) in which I am calling this SQL notebook. My code runs perfectly fine on re...

  • 2626 Views
  • 2 replies
  • 4 kudos
Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 4 kudos

Use magic commands; that way you can mix Python and SQL in the same notebook. It will work (see the sketch below).

  • 4 kudos
1 More Replies
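As an illustration of the magic-command approach, a minimal sketch for the layout described in the thread (caller at notebooks/python/dataloader, SQL notebook at notebooks/python/dataloader/queries/ecom_sellout); `%run` and `dbutils` are only available inside a Databricks notebook:

```python
# Option 1: %run executes the SQL notebook inline, in the caller's context.
# It must be the only content of its cell.
# %run ./queries/ecom_sellout

# Option 2: dbutils.notebook.run launches the SQL notebook as a separate run and
# returns whatever that notebook passes to dbutils.notebook.exit().
result = dbutils.notebook.run("./queries/ecom_sellout", 600)  # 600-second timeout
print(result)
```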
labtech
by Valued Contributor II
  • 2100 Views
  • 3 replies
  • 20 kudos

Resolved! Create Databricks Workspace with different email address on Azure

Hi team, I wonder if we can create a Databricks workspace that is not related to the Azure email address. Thanks

  • 2100 Views
  • 3 replies
  • 20 kudos
Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 20 kudos

Yes, I have done this multiple times.

  • 20 kudos
2 More Replies
Thanapat_S
by Contributor
  • 4013 Views
  • 3 replies
  • 5 kudos

Resolved! How could I export an Alert object for deployment to another Azure Databricks resource?

Introduction: I would like to use the Alert feature to monitor job status (from a log table) in Databricks SQL. So, I have written a query in a query notebook (or object) to return results from the log table. Also, I have set the alert object for monitoring and tri...

  • 4013 Views
  • 3 replies
  • 5 kudos
Latest Reply
Harun
Honored Contributor
  • 5 kudos

I am not seeing any direct option to export or version-control the alert object other than the migrate option. Check this link, it might help you in another way: https://docs.databricks.com/sql/api/queries-dashboards.html (a sketch of pulling alerts over that API follows below).

  • 5 kudos
2 More Replies
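A rough sketch of pulling alert definitions over the REST API so they can be version-controlled and re-created in another workspace. The host, token, the legacy `/api/2.0/preview/sql/alerts` endpoint path, and the response shape are assumptions based on the linked Queries & Dashboards docs and may differ on newer API versions:

```python
import json
import requests

HOST = "https://<workspace-url>"    # placeholder workspace URL
TOKEN = "<personal-access-token>"   # placeholder personal access token
headers = {"Authorization": f"Bearer {TOKEN}"}

# List alerts; endpoint path and response shape follow the legacy preview API.
resp = requests.get(f"{HOST}/api/2.0/preview/sql/alerts", headers=headers)
resp.raise_for_status()
alerts = resp.json()
alerts = alerts.get("results", alerts) if isinstance(alerts, dict) else alerts

# Dump each alert definition to JSON; re-creating it in the target workspace is a
# POST of an equivalent payload (with workspace-specific query IDs adjusted).
for alert in alerts:
    with open(f"alert_{alert['id']}.json", "w") as fh:
        json.dump(alert, fh, indent=2)
```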
Himanshi
by New Contributor III
  • 1869 Views
  • 1 reply
  • 6 kudos

How to exclude existing files when moving a streaming job from one Databricks workspace to another, where the existing checkpoint state may not be compatible with resuming the stream processing?

We do not want to process all the old files; we only want to process the latest files. Whenever we use a new checkpoint path in another Databricks workspace, the streaming job processes all the old files as well. Without the Auto Loader feature, is there ...

  • 1869 Views
  • 1 reply
  • 6 kudos
Latest Reply
Shalabh007
Honored Contributor
  • 6 kudos

@Himanshi Patle, in Spark streaming there is an option, maxFileAge, with which you can control which files to process based on their timestamp (see the sketch below).

  • 6 kudos
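A minimal PySpark sketch of the suggestion above; the source format, schema, and paths are placeholders, and maxFileAge is the standard Spark file-source option (its age is measured relative to the newest file seen, so check the Spark docs for the exact first-batch semantics):

```python
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

# Placeholder schema for the incoming files.
input_schema = StructType([
    StructField("id", StringType()),
    StructField("event_time", TimestampType()),
])

# Restarted stream in the new workspace, skipping old files by age.
df = (
    spark.readStream
    .format("json")                    # placeholder source format
    .schema(input_schema)
    .option("maxFileAge", "1d")        # ignore files older than one day
    .load("/mnt/landing/events/")      # placeholder input path
)

query = (
    df.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events_new/")  # new checkpoint path
    .start("/mnt/delta/events/")       # placeholder output path
)
```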
User16844487905
by New Contributor III
  • 4640 Views
  • 4 replies
  • 5 kudos

AWS quickstart - CloudFormation failure

When deploying your workspace with the recommended AWS quickstart method, a CloudFormation template will be launched in your AWS account. If you experience a failure with an error message along the lines of ROL...

  • 4640 Views
  • 4 replies
  • 5 kudos
Latest Reply
yalun
New Contributor III
  • 5 kudos

How do I launch the "Quickstart" again? Where is it in the console?

  • 5 kudos
3 More Replies
Anonymous
by Not applicable
  • 1834 Views
  • 2 replies
  • 16 kudos

Resolved! How to access data files for Databricks Workspace directly through Azure Blob Storage

Hi everyone, this is the first time I have used Azure to deploy Databricks (before, I was quite familiar with deploying Databricks on AWS). I want to view Databricks workspace files directly from the Azure portal but I never have permission. Could you give m...

  • 1834 Views
  • 2 replies
  • 16 kudos
Latest Reply
Unforgiven
Valued Contributor III
  • 16 kudos

@Jensen Ackles, please read the document linked below: https://docs.databricks.com/external-data/azure-storage.html I don't know what steps you have done; hope this can help you in this case (a short sketch follows below).

  • 16 kudos
1 More Replies
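A short sketch along the lines of the linked doc, assuming an ADLS Gen2 (abfss) account with access-key auth; the account, container, secret scope, and path are placeholders. Note that browsing the files in the Azure portal itself additionally needs an Azure RBAC role such as Storage Blob Data Reader on the storage account:

```python
# Configure Spark to authenticate to the storage account with an access key kept
# in a Databricks secret scope, then read directly over abfss://.
storage_account = "<storage-account>"   # placeholder
container = "<container>"               # placeholder

spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    dbutils.secrets.get(scope="<scope>", key="<storage-key>"),
)

df = spark.read.format("delta").load(
    f"abfss://{container}@{storage_account}.dfs.core.windows.net/<path>"
)
display(df)
```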
horatiug
by New Contributor III
  • 4591 Views
  • 8 replies
  • 3 kudos

Create workspace in Databricks deployed in Google Cloud using terraform

In the documentation (https://registry.terraform.io/providers/databricks/databricks/latest/docs and https://docs.gcp.databricks.com/dev-tools/terraform/index.html) I could not find documentation on how to provision Databricks workspaces in GCP. Only cre...

  • 4591 Views
  • 8 replies
  • 3 kudos
Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @horatiu guja, does @Debayan Mukherjee's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? Else, we can help you with more details.

  • 3 kudos
7 More Replies
its-kumar
by New Contributor III
  • 8364 Views
  • 9 replies
  • 4 kudos

Resolved! Error While creating the Databricks workspace on the AWS cloudformation stack

I am getting the following error during databricksAPIFunction resource creation, and the AWS stack is failing with a rollback. Resource handler returned message: "Your access has been denied by S3, please make sure your request credentials have permission t...

  • 8364 Views
  • 9 replies
  • 4 kudos
Latest Reply
Vartika
Databricks Employee
  • 4 kudos

Hi @Kumar Shanu, thank you for coming back and letting us know. It was really great of you to mark an answer as best and point everyone in the right direction. Have a great Databricks journey ahead!

  • 4 kudos
8 More Replies
absolutelyRice
by New Contributor III
  • 9301 Views
  • 5 replies
  • 2 kudos

Resolved! Databricks Terraform Provider Issues Passing Providers to Child Modules

I have been following the Terraform Databricks documentation in order to provision account-level resources on AWS. I can create the workspace fine, add users, etc. However, when I go to use the provider in non-MWS mode, I am re...

  • 9301 Views
  • 5 replies
  • 2 kudos
Latest Reply
absolutelyRice
New Contributor III
  • 2 kudos

So the answer to this was that you need to explicitly pass the provider argument to each of the data resource blocks. The docs should be updated to reflect that, i.e. data "databricks_spark_version" "latest" { provider = databricks.workspace ...

  • 2 kudos
4 More Replies
Trung
by Contributor
  • 1597 Views
  • 2 replies
  • 1 kudos

Resolved! Databricks best practices to manage resources corresponding to a deleted user

Currently I have some problems with my Databricks workspace when a user was deleted, and it causes some issues: applications or scripts that use the tokens generated by the user will no longer be able to access the Databricks API; jobs owned by the user wi...

  • 1597 Views
  • 2 replies
  • 1 kudos
Latest Reply
Trung
Contributor
  • 1 kudos

@Vivian Wilfred, it is really useful for my case, many thanks!

  • 1 kudos
1 More Replies
archanarddy
by New Contributor
  • 1208 Views
  • 0 replies
  • 0 kudos

metastore is down

I am trying to run a Scala notebook, but my job just spins and says the metastore is down. Can someone help me? Thanks in advance.

  • 1208 Views
  • 0 replies
  • 0 kudos