Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Nietzsche
by New Contributor III
  • 2637 Views
  • 2 replies
  • 1 kudos

Resolved! Why does Databricks Free Edition ask for my login every single time I try to log on?

Hello all, every time I try to log into Databricks Free Edition, I have to sign in with my email and password and then a verification code via my email account. It gets really tiresome after a while. Is there a way for Databricks to remember my account log...

Latest Reply
Nietzsche
New Contributor III
  • 1 kudos

Thank you.

1 More Replies
Meghna18
by New Contributor
  • 897 Views
  • 1 reply
  • 1 kudos

Sandbox environment

Hello, I wanted to get hands-on experience with the Databricks platform and need a sandbox environment for the same. Please let me know if we have any such platform to practice, learn, and explore Databricks concepts.

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @Meghna18, sure, you can use Databricks Free Edition for that purpose. Databricks Free Edition is a no-cost version of Databricks designed for students, educators, hobbyists, and anyone interested in learning or experimenting with data and AI. Free...

LulutoxSe
by New Contributor
  • 406 Views
  • 0 replies
  • 0 kudos

Lulutox: The secret behind natural weight loss and increased energy

Lulutox Whether you are just starting your weight-loss journey or want to break through a plateau, Lulutox offers a well-rounded, body-positive path to transformation. It is more than just a pill – it is a tool for reclaiming control ove...

Sujitha
by Databricks Employee
  • 25155 Views
  • 6 replies
  • 14 kudos

Join the Databricks Champions program

The Databricks Champions program is exclusively for current Databricks customers. It provides you with opportunities to: share your Databricks story, be recognized for the impact of your work, promote your personal brand to progress your career, and aid your r...

Latest Reply
deepakhari
New Contributor II
  • 14 kudos

Could not register with this link.

5 More Replies
GRV
by New Contributor II
  • 873 Views
  • 2 replies
  • 0 kudos

Data import.

I am using Databricks Free Edition, and when I import a CSV file I am not able to access that CSV file using Spark commands (when I am creating a DataFrame); it says "Public DBFS root is disabled." and I can't find any "Data" tab in my side...

Latest Reply
ilir_nuredini
Honored Contributor
  • 0 kudos

Hello @GRV Regarding DBFS, there is no official update on enabling full access in the Free Edition. Also, DBFS is now considered a legacy approach; using Unity Catalog Volumes for storing and accessing data going forward is recomm...

1 More Replies
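The Volumes route suggested in the reply can be sketched as follows. This is a minimal, hedged example: the catalog, schema, volume, and file names are placeholders, and `spark` only exists inside a Databricks notebook, so the read is guarded.

```python
# Sketch: read a CSV uploaded to a Unity Catalog Volume instead of DBFS.
# "workspace", "default", "raw_files", and "sales.csv" are placeholder names.
catalog, schema, volume = "workspace", "default", "raw_files"
csv_path = f"/Volumes/{catalog}/{schema}/{volume}/sales.csv"

# `spark` is predefined in a Databricks notebook; guard so the sketch is
# inert outside that environment.
if "spark" in globals():
    df = (spark.read
               .option("header", "true")
               .csv(csv_path))
    df.display()
```

In the Free Edition, files can be uploaded to a Volume through Catalog Explorer, and the resulting `/Volumes/...` path works directly with `spark.read`.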
yugz
by New Contributor III
  • 3815 Views
  • 9 replies
  • 2 kudos

Spark configuration to access Azure Blob or ADLS

I am new to Databricks, I am using the free edition of Databricks. I have tried [spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "SAS")spark.conf.set("fs.azure.sas.token.provider.type.<storage-account>.dfs.core.win...

Latest Reply
nayan_wylde
Honored Contributor III
  • 2 kudos

This is an old way of accessing a data lake. With the Free Edition and serverless it is not supported. Try creating an external credential and an external location for the data lake. https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql...

8 More Replies
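The credential-plus-external-location approach in the reply looks roughly like the sketch below. All names, the storage account, and the container are placeholders, and the storage credential itself must already exist (created by an admin in Catalog Explorer or via SQL):

```python
# Sketch: Unity Catalog external location in place of session-level SAS confs.
# `landing_zone`, `my_cred`, `landing`, and `mystorageacct` are placeholders.
ddl = """
CREATE EXTERNAL LOCATION IF NOT EXISTS landing_zone
URL 'abfss://landing@mystorageacct.dfs.core.windows.net/'
WITH (STORAGE CREDENTIAL my_cred)
"""

if "spark" in globals():  # `spark` exists inside a Databricks notebook
    spark.sql(ddl)
    # Once the location is granted, paths under it are readable directly:
    df = spark.read.csv(
        "abfss://landing@mystorageacct.dfs.core.windows.net/data/",
        header=True)
```

Unlike `spark.conf.set` SAS settings, access is then governed by Unity Catalog grants and works on serverless compute.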
Amit110409
by New Contributor
  • 733 Views
  • 1 reply
  • 0 kudos

ask to how to use databricks community version

Hi guys, as I am new to Databricks, I use the Databricks free tier, but I want to explore it more and practice more. How do we get access to the community version? Because when I go to the community version, it redirects me to the free edition, but ...

Latest Reply
ilir_nuredini
Honored Contributor
  • 0 kudos

Hello @Amit110409, Databricks has announced the Free Edition of Databricks. It contains many more capabilities than the Community Edition. Regarding DBFS, there is no official update on enabling full access in the Free Edition. Also, DBFS is n...

Roger667
by New Contributor
  • 2970 Views
  • 2 replies
  • 0 kudos

Converting managed table to external tables

I have some managed tables in the catalog which I plan to convert to external tables, but I want to preserve the version history of the tables as well. I have tried deep cloning, but it builds the external table as version 0. Is there any way I can achieve t...

Latest Reply
ElizabethB
Databricks Employee
  • 0 kudos

I'm curious to know, what is the reason why you are looking to convert from managed to external? Most customers are looking to convert from external to managed, since UC managed tables help them reduce storage costs and increase query speeds. If ther...

1 More Replies
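For context, the deep-clone attempt the poster describes looks roughly like this sketch; the clone is a brand-new Delta table, so its history starts at version 0 regardless of the source's versions. Table names and the location are placeholders:

```python
# Sketch: deep clone of a managed table to an external location.
# `main.default.events` and the abfss path are placeholder names.
clone_ddl = """
CREATE TABLE main.default.events_ext
DEEP CLONE main.default.events
LOCATION 'abfss://tables@mystorageacct.dfs.core.windows.net/events'
"""

if "spark" in globals():  # `spark` exists inside a Databricks notebook
    spark.sql(clone_ddl)
    # The clone's Delta history begins fresh at version 0:
    spark.sql("DESCRIBE HISTORY main.default.events_ext").show()
```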
Upendra_Dwivedi
by Contributor
  • 1367 Views
  • 1 reply
  • 0 kudos

Azure File Share Connect with Databricks

Hi all, I am working on a task where I need to access an Azure File Share from Databricks and move files from there to a storage account blob container. I found one solution, which is to use the azure-file-share Python package, and it needs a SAS token. But I don't...

Latest Reply
Omerabbasi
New Contributor II
  • 0 kudos

I think you are on the right track, but getting a bit more granular: once the Azure File Share is mounted, use Spark to move the data from the source path and write it to a blob container.

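A hedged sketch of the SDK route discussed in this thread, assuming the `azure-storage-file-share` and `azure-storage-blob` packages and a SAS token valid for both services; the account, share, container, and path names are placeholders:

```python
def copy_share_file_to_blob(account: str, sas_token: str,
                            share: str, file_path: str,
                            container: str, blob_name: str) -> None:
    """Copy one file from an Azure File Share to a Blob container."""
    # Imports kept local so the sketch loads even without the Azure SDKs.
    from azure.storage.fileshare import ShareFileClient
    from azure.storage.blob import BlobClient

    src = ShareFileClient(
        account_url=f"https://{account}.file.core.windows.net",
        share_name=share, file_path=file_path, credential=sas_token)
    dst = BlobClient(
        account_url=f"https://{account}.blob.core.windows.net",
        container_name=container, blob_name=blob_name, credential=sas_token)

    data = src.download_file().readall()   # read the share file into memory
    dst.upload_blob(data, overwrite=True)  # write it out as a blob
```

For large files, a streaming copy (or Azure Data Factory / `azcopy`) would avoid holding the whole file in memory.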
sue01
by New Contributor II
  • 3435 Views
  • 3 replies
  • 0 kudos

Error with using Vector assembler in Unity Catalog

Hello, I am getting the below error while trying to convert my features using VectorAssembler on a Unity Catalog cluster. I tried setting up the config as mentioned in a different post, but it still did not work. Could use some help here. Thank you.

Get Started Discussions
unitycatalog mlflowerror
Latest Reply
ankur2917
New Contributor II
  • 0 kudos

I am stuck on the same issue, it does not make any sense to keep these blocked by UC. Let me know if someone got any solution of this issue.

2 More Replies
billyboy
by New Contributor II
  • 725 Views
  • 1 reply
  • 0 kudos

Does Databricks run its own compute clusters?

I'm rather new to Databricks, so I understand this might be a silly question, but from what I understand so far, Databricks leverages Spark for parallelized computation. When we create a compute resource, is it using the compute power from whatever cloud provider...

Latest Reply
ilir_nuredini
Honored Contributor
  • 0 kudos

Hello billyboy, you can start by looking at the official architecture documentation: https://learn.microsoft.com/en-us/azure/databricks/getting-started/overview And next, this is the article I like, which goes into more detail: https://www.accent...

RajaDOP
by New Contributor
  • 3928 Views
  • 2 replies
  • 0 kudos

How to create a mount point to File share in Azure Storage account

Hello all, I have a requirement to create a mount point to a file share in an Azure Storage account. I followed the official documentation; however, I could not create the mount point to the file share, and the documentation described the mount point creatio...

Latest Reply
Aaaddison
New Contributor II
  • 0 kudos

Hi Raja,You're correct that the wasbs:// method is for Azure Blob Storage, not File Shares! I believe File Share mounting is different and would require you to use SMB protocol mounted outside of Databricks since File Shares isn't natively supported!...

1 More Replies
amitkumarvish
by New Contributor II
  • 2080 Views
  • 4 replies
  • 3 kudos

Databricks Apps Deployment with React codebase

Hi, I need help understanding how we can deploy a frontend (React) codebase via Databricks Apps, as I have tried all the templates and custom app creation. It seems only Python-based codebases can be deployed. Let me know if anyone can help me with an approach or fesi...

Get Started Discussions
Databricks Apps
Latest Reply
cgrant
Databricks Employee
  • 3 kudos

Here is some new documentation for using node.js with Databricks apps.

3 More Replies
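As a rough illustration of the Node.js route the reply links to, a Databricks App can declare a non-Python entry point in its `app.yaml`. This is a hypothetical sketch, not taken from the thread; the server file name and environment variable are assumptions:

```yaml
# Hypothetical app.yaml for a Node.js server that serves a React build
command: ["node", "server.js"]
env:
  - name: "NODE_ENV"
    value: "production"
```

The React app would be compiled (e.g. `npm run build`) and the static output served by the Node process, since the App runtime launches whatever `command` specifies.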
Tarun-1100
by New Contributor
  • 1787 Views
  • 1 reply
  • 0 kudos

Integrate Genie to teams

Hey, I'm trying to integrate Genie into Teams. I am an admin and have all rights; I created a Genie to test. We are encountering a PermissionDenied error while interacting with the Genie API via the SDK and a workspace token. Details: Workspace URL: https://dbc-125a3...

Get Started Discussions
API
Genie
integration
Latest Reply
chanukya-pekala
Contributor III
  • 0 kudos

Check this repo -  TeamsGenieIntegration

Dimitry
by Contributor III
  • 2281 Views
  • 4 replies
  • 0 kudos

Resolved! UDF fails with "No module named 'dbruntime'" when using dbutils

I've got a UDF which I call using applyInPandas. That UDF is there to distribute API calls. It uses my custom .py library files that make these calls. Everything worked until I used `dbutils.widgets.get` and `dbutils.secrets.get` inside these libraries. It thro...

Latest Reply
df_dbx
New Contributor II
  • 0 kudos

Answering my own question. Similar to the original response, the answer was to pass in the secret as a function argument:CREATE OR REPLACE FUNCTION geocode_address(address STRING, api_key STRING) RETURNS STRUCT<latitude: DOUBLE, longitude: DOUBLE> ...

3 More Replies
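The pattern from the accepted answers, generalized: resolve the secret on the driver and pass it into the worker-side function as a plain argument, since `dbutils` is unavailable in UDF worker processes. A minimal sketch, with placeholder scope/key names and a stub in place of the real API call:

```python
import pandas as pd

def make_api_caller(api_key: str):
    # Close over the key so the workers never have to touch dbutils.
    def call_api(pdf: pd.DataFrame) -> pd.DataFrame:
        pdf = pdf.copy()
        pdf["result"] = pdf["address"].apply(
            lambda a: f"stub:{api_key[:4]}:{a}")  # placeholder for a real call
        return pdf
    return call_api

if "spark" in globals():  # driver side, inside a Databricks notebook
    key = dbutils.secrets.get("my_scope", "geocode_api_key")
    df = spark.createDataFrame([("1 Main St",)], ["address"])
    out = df.groupBy().applyInPandas(
        make_api_caller(key),
        schema="address string, result string")
```

Because the key is captured by the closure before the function is shipped to the workers, no `dbruntime` import ever happens on the executor side.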

Join Us as a Local Community Builder!

Passionate about hosting events and connecting people? Help us grow a vibrant local community—sign up today to get started!

Sign Up Now