Hello Databricks Community, I have successfully migrated my DBFS (Databricks File System) from a source workspace to a target workspace, moving it from a path in Browse DBFS -> Folders to a Catalog -> Schema -> Volume. Now, I want to validate the migra...
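One common way to validate such a migration is to compare file inventories (names and sizes) between the old DBFS folder and the new volume. A minimal sketch, assuming it runs in a Databricks notebook (where dbutils is available without an import); both paths are placeholders:

```python
src = "dbfs:/my_folder/"                       # hypothetical source path
dst = "/Volumes/my_catalog/my_schema/my_vol/"  # hypothetical target volume

def inventory(root):
    """Return {relative_path: size_in_bytes} for every file under root."""
    root = root.removeprefix("dbfs:")
    files, stack = {}, [root]
    while stack:
        for f in dbutils.fs.ls(stack.pop()):
            p = f.path.removeprefix("dbfs:")  # normalize dbfs:/ vs /Volumes paths
            if f.isDir():
                stack.append(p)
            else:
                files[p[len(root):]] = f.size
    return files

src_files, dst_files = inventory(src), inventory(dst)
print("missing in target:", set(src_files) - set(dst_files))
print("size mismatches:", {k for k in src_files.keys() & dst_files.keys()
                           if src_files[k] != dst_files[k]})
```

If counts and sizes match, a deeper check (e.g. per-file checksums on a sample) can follow.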
Hello, I am planning to deploy a workspace with Unity Catalog enabled. Managing permissions in one place sounds like a good solution. It can even simplify dataset architecture by masking rows and columns. As an architect, I’m concerned about the user’...
What are some best practices for optimizing Spark jobs in Databricks, especially when dealing with large datasets? Any tips or resources would be greatly appreciated! I’m trying to analyze data on restaurant menu prices, so insights would be especiall...
Optimizing Spark jobs in Databricks can significantly enhance performance. Here are some strategies to consider:
- Efficient Partitioning: Proper partitioning reduces shuffle times, leading to faster data processing.
- Caching: Utilize Delta caching inste...
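As an illustrative sketch of the first two points, assuming a hypothetical bronze.menu_prices table with restaurant_id and item_price columns:

```python
from pyspark.sql import functions as F

# Hypothetical source table -- adjust to your menu-prices dataset.
df = spark.read.table("bronze.menu_prices")

# Repartition on the aggregation key so the groupBy below shuffles less.
df = df.repartition(200, "restaurant_id")

# Cache only if the DataFrame is reused several times in the same job.
df.cache()

# Example aggregation that benefits from the partitioning above.
avg_prices = (df.groupBy("restaurant_id")
                .agg(F.avg("item_price").alias("avg_price")))
avg_prices.write.mode("overwrite").saveAsTable("silver.avg_menu_prices")
```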
You can create Tableau-styled charts without leaving your notebook with just a few lines of code. Imagine this: you’re working in a Databricks notebook, trying to explore your Spark/Pandas DataFrame, but visualizing the data or performing Explorator...
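The post presumably showcases a specific charting library; as a generic stand-in, a few lines of plotly express (available on most Databricks runtimes) produce an interactive chart from a Spark DataFrame. Table and column names are placeholders:

```python
import plotly.express as px

# Hypothetical Spark table; sample a bounded slice to pandas for plotting.
pdf = spark.table("bronze.menu_prices").limit(10_000).toPandas()

fig = px.histogram(pdf, x="item_price", nbins=40,
                   title="Distribution of menu item prices")
fig.show()  # renders inline in a Databricks notebook
```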
Hello Databricks Community, I am currently in the process of migrating Service Principals from a non-Unity workspace to a Unity-enabled workspace in Databricks. While the Service Principals themselves seem to be migrating correctly, I am facing an is...
Do you have the option to open a support ticket? If not, I would suggest running additional code that disables the entitlement for the new objects, since it seems the entitlement is not being passed properly in the original call.
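As a sketch of what that extra code could look like, assuming the entitlement was carried over via SCIM: the workspace URL, token, service-principal id, and entitlement name below are all placeholders, and the exact PATCH payload should be double-checked against the SCIM API docs.

```python
import requests

HOST = "https://<workspace-url>"          # placeholder
TOKEN = "<admin-personal-access-token>"   # placeholder; needs admin rights
SP_ID = "<scim-id-of-service-principal>"  # placeholder

resp = requests.patch(
    f"{HOST}/api/2.0/preview/scim/v2/ServicePrincipals/{SP_ID}",
    headers={"Authorization": f"Bearer {TOKEN}",
             "Content-Type": "application/scim+json"},
    json={
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [
            # Remove the entitlement that was carried over incorrectly;
            # the entitlement name here is just an example.
            {"op": "remove",
             "path": 'entitlements[value eq "allow-cluster-create"]'}
        ],
    },
)
resp.raise_for_status()
```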
Do you require a Service Principal or a Group to have admin rights, either to allow automation or to reduce the effort of adding the permission for each user?
Solution
For Service Principals:
You need to be at least a Workspace Admin. You can eit...
Hi Team, Can you provide your thoughts on moving Databricks from Azure to GCP? What services are required for the migration, and are there any limitations on GCP compared to Azure? Also, are there any tools that can assist with the migration? Please ...
Hello Team,
Adding to @sunnydata's comments:
Moving Databricks from Azure to GCP involves several steps and considerations. Here are the key points based on the provided context:
Services Required for Migration:
- Cloud Storage Data: Use GCP’s Storage T...
Did the SparkUI training yesterday with Mark Ott, and I highly recommend it. It was super helpful and provided a lot of clarity around some of the vaguer terms and metrics, and some surprise penalties. In-memory partition size is the main thing to...
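For anyone who wants to check this on their own jobs, a few notebook one-liners expose the partition picture; the table name below is a placeholder:

```python
from pyspark.sql.functions import spark_partition_id

df = spark.read.table("bronze.events")  # hypothetical table

# How many partitions the scan produced.
print("partitions:", df.rdd.getNumPartitions())

# Rows per partition -- heavily skewed counts usually mean skewed
# in-memory partition sizes as well.
df.groupBy(spark_partition_id().alias("pid")).count().orderBy("pid").show()

# Input split size is governed by this setting (default 128 MB).
print(spark.conf.get("spark.sql.files.maxPartitionBytes"))
```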
Hi Team, What is the best way to transfer Talend ETL code to Databricks, and what are the best methods/practices for migrating Talend ETLs to Databricks (notebooks, code conversion/migration strategy, workflows, etc.)? Regards, Janga
If you're planning to migrate from your current technology to Databricks, Travinto Technologies' Code Converter Tool is here to make the process seamless. This powerful tool enables you to migrate data, ETL workflows, and reports across platforms eff...
Hello Databricks Community, I am a beginner with Databricks. I am wondering if we can download PowerPoint slides or learning documents from the Databricks Learning Platform. I like to read after taking the online course. Could you let me know? Curren...
Hi All, We have a situation where we write data to CosmosDB and create JSON data for a transaction table, which includes a mini statement in JSON format. Now, we want to introduce the concept of Delta Sharing and share the transaction table. The Java ...
Thanks for your reply. Right now, the team is transferring data from Databricks to Cosmos DB, and then they're using REST APIs to access that data. They handle about 100 requests per minute, with some tables needing around 100 requests per second due...
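On the Delta Sharing side, a recipient can read the shared transaction table directly with the open-source delta-sharing client (pip install delta-sharing) instead of going through a REST layer. A minimal sketch, assuming the provider has issued a profile file; the share/schema/table names are placeholders:

```python
import delta_sharing

profile = "/path/to/config.share"  # credentials file issued by the provider
table_url = f"{profile}#my_share.my_schema.transactions"

# Load the shared transaction table; JSON columns arrive as strings
# and can be parsed downstream.
df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```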
Hi,If I connect databricks (trial version) with AWS/Azure/Google Cloud and then work on dashboards and Genie - will there be any minimal charges, or its completely free to use the cloud services?
I am trying to connect with the DataGrip-provided driver. I cannot get this to work with a token from DataGrip. The connection URL is: jdbc:databricks://dbc-******.cloud.databricks.com:443/***_analytics;httpPath=/sql/1.0/warehouses/ba***3
I am gettin...
Hi @Alberto_Umana, thanks. I created the token in Databricks under User Settings > Access Tokens, indeed. I'm not sure how to ensure it is valid and has the necessary permissions to access the Databricks SQL warehouse. I generated it recently, though.
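One way to confirm the token works independently of DataGrip is to query the warehouse directly with the databricks-sql-connector package (pip install databricks-sql-connector); the hostname and HTTP path below reuse the redacted placeholders from the URL above, and the token value is hypothetical:

```python
from databricks import sql

with sql.connect(
    server_hostname="dbc-******.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/ba***3",
    access_token="<personal-access-token>",
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT current_user()")
        print(cur.fetchone())  # success confirms the token is valid
```

If this works but DataGrip still fails, the problem is likely in the JDBC settings (e.g. the username must be "token" with the PAT as the password) rather than the token itself.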
Hi Team, We are working on onboarding a new Data Product to the current Databricks Lakehouse Platform. The first step is the foundation, where we should get data from SAP SuccessFactors to S3 + the Bronze layer and then do the initial setup of Lakehouse + Power B...
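For the S3-to-Bronze step, Auto Loader is a common pattern. A minimal sketch, assuming JSON extracts from SuccessFactors land in an S3 bucket; the bucket, paths, and table name are all placeholders:

```python
# Incrementally ingest new files from S3 into a Bronze Delta table.
(spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "json")
      .option("cloudFiles.schemaLocation", "s3://my-bucket/_schemas/sf_employees")
      .load("s3://my-bucket/raw/successfactors/employees/")
      .writeStream
      .option("checkpointLocation", "s3://my-bucket/_checkpoints/sf_employees")
      .trigger(availableNow=True)   # process the backlog, then stop
      .toTable("bronze.sf_employees"))
```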