Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

jermaineharsh
by New Contributor III
  • 1308 Views
  • 0 replies
  • 1 kudos

How to switch from free trial to Community Edition of Databricks in my Azure workspace?

Hello, I am trying to switch to the Databricks Community Edition after a 14-day trial. I was able to register, but when I try to start my new cluster, I get an error message: "Cluster start feature is currently disabled, and the cluster does not run". In...

Picci
by New Contributor III
  • 5193 Views
  • 3 replies
  • 3 kudos

Resolved! Databricks community edition still available?

Is the Databricks platform still available in its Community Edition (outside Azure, AWS, or GCP)? Can someone share the updated link? Thanks, Elisa

Latest Reply
jamescw
New Contributor II
  • 3 kudos

Look: it is still available, but AFAIK always linked to Azure/GCP/AWS.

2 More Replies
bento
by New Contributor
  • 16331 Views
  • 1 reply
  • 1 kudos

Resolved! Notebook Langchain ModuleNotFoundError: No module named 'langchain.retrievers.merger_retriever'

Hi, As mentioned in the title, I am receiving this error despite running %pip install --upgrade langchain. Specific line of code: from langchain.retrievers.merger_retriever import MergerRetriever. All other langchain imports work when this is commented out. Same line w...

Latest Reply
sean_owen
Databricks Employee
  • 1 kudos

More specifically: langchain releases a new update every few days, and it is likely that you are using code or a library that needs a later version of langchain than you have (or, perhaps, a later version that removed whatever part of langchain you r...

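A practical takeaway from this thread is to check what is actually installed before importing, and to fail with a clear message instead of a deep ModuleNotFoundError. This is a minimal plain-Python sketch (not from the thread; the helper names are illustrative), using the standard library's importlib.metadata:

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(package):
    """Return the installed version string for a package,
    or None if it is absent from the current environment."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

def require(package):
    """Fail fast with an actionable message when a dependency is missing,
    rather than surfacing a ModuleNotFoundError from deep inside an import."""
    found = installed_version(package)
    if found is None:
        raise RuntimeError(
            f"{package} is not installed; pin an exact version "
            f"(e.g. %pip install {package}==<known-good version>) and "
            "restart the Python process before importing it."
        )
    return found
```

Pinning an exact, known-good version, rather than `--upgrade`, is generally the safer habit for fast-moving libraries like langchain, since each upgrade may move or remove modules.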
ankit_batra1
by New Contributor
  • 2139 Views
  • 2 replies
  • 1 kudos

Databricks notebook execution using only one task

I am running a Databricks notebook. While running, I only see one task on one worker getting started. My cluster has a minimum of 6 workers, but it seems like they are not getting used. I am performing a read operation from Cosmos DB. Can someone please help me her...

Latest Reply
sean_owen
Databricks Employee
  • 1 kudos

If your code does not use Spark, it will not use any machines except the driver. If you're using Spark but your source data that you operate on has 1 partition, there will be only 1 task. Hard to say more without knowing what you are doing in more de...

1 More Replies
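The point in the reply can be illustrated outside Spark: parallelism is capped by the number of partitions, no matter how many workers exist. This is a plain-Python analogy (not the Spark API; names are illustrative), where each partition becomes one unit of work:

```python
from concurrent.futures import ThreadPoolExecutor

def process_partition(rows):
    # Stand-in for the work one Spark task performs on one partition.
    return sum(rows)

def run_job(partitions, max_workers=6):
    # Even with 6 workers available, only len(partitions) tasks exist,
    # so a single-partition input leaves 5 workers idle.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(process_partition, partitions))

print(run_job([list(range(10))]))         # 1 partition -> 1 task
print(run_job([[1, 2], [3, 4], [5, 6]]))  # 3 partitions -> 3 tasks
```

In Spark itself the usual remedy is to raise the partition count of the source read, for example by repartitioning after the load or by adjusting the connector's partitioning options, if the connector exposes any.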
FutureLegend
by New Contributor III
  • 5337 Views
  • 2 replies
  • 1 kudos

MIT License and Fine-tuning

Some questions related to fine-tuning and the MIT License. I read the MIT License but am still confused about some points. If I fine-tune the Dolly-v2 model, say using LoRA and my own dataset: Do I "own" the fine-tuned model? Am I allowed to change the name...

Latest Reply
sean_owen
Databricks Employee
  • 1 kudos

I am not sure I agree with the discussion so far. While none of us here are lawyers, I think it's fairly straightforward to reason about the licensing. You have created a combined, derivative work from the Dolly weights in this case. You have copyright i...

1 More Replies
meystingray
by New Contributor II
  • 1568 Views
  • 0 replies
  • 0 kudos

Databricks Rstudio Init Script Deprecated

OK, so I'm trying to use open-source RStudio on Azure Databricks. I'm following the instructions here: https://learn.microsoft.com/en-us/azure/databricks/sparkr/rstudio#install-rstudio-server-open-source-edition. I've installed the necessary init script ...

dannythermadom
by New Contributor III
  • 1776 Views
  • 2 replies
  • 1 kudos

Getting FileNotFoundException while using cloudFiles

Hi, Following is the code I am using to ingest the data incrementally (weekly): val ssdf = spark.readStream.schema(schema).format("cloudFiles").option("cloudFiles.format", "parquet").load(sourceUrl).filter(criteriaFilter); val transformedDf = ssdf.tran...

Latest Reply
BilalAslamDbrx
Databricks Employee
  • 1 kudos

Danny, is another process mutating or deleting the incoming files?

1 More Replies
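The reply's question points at a common failure mode: a stream discovers (lists) a file in one step and reads it in a later step, so a file deleted in between produces exactly this kind of FileNotFoundException. A plain-Python sketch of the race (illustrative names, not the Spark or cloudFiles API):

```python
import os
import tempfile

def read_discovered(paths):
    """Read files that were discovered (listed) earlier. If another process
    deleted one in the meantime, this raises FileNotFoundError, mirroring a
    cloudFiles stream failing on a source file that has vanished."""
    return [open(p).read() for p in paths]

tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "part-0001.parquet")
with open(path, "w") as f:
    f.write("rows")

discovered = [path]  # step 1: the stream lists the file
os.remove(path)      # step 2: another process deletes it
try:
    read_discovered(discovered)  # step 3: the deferred read fails
except FileNotFoundError:
    print("file vanished between listing and read")
```

If this matches the situation, the usual fixes are to stop the competing process from mutating the landing folder, or to have it archive files only after the stream has finished processing them.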
Ruby8376
by Valued Contributor
  • 7065 Views
  • 6 replies
  • 3 kudos

Resolved! Using Databricks for the end-to-end flow, rather than ADF, for extracting data?

Currently, in our company we are using ADF + Databricks for all batch integration. Using ADF, data is first copied to ADLS Gen2 (from different sources like on-prem servers, FTP file sharing, etc.), then it is reformatted to CSV and it is copie...

Latest Reply
Ruby8376
Valued Contributor
  • 3 kudos

@-werners- Is there any benefit to doing the extract part in Databricks itself, unlike our current architecture, where we first load to ADLS using ADF? I guess it is worth doing it all end to end in Databricks if there is better processing, lower lat...

5 More Replies
Phani1
by Valued Contributor II
  • 1346 Views
  • 1 reply
  • 0 kudos

RBAC, Security & Privacy controls

Could you please share best practices for implementing RBAC, security, and privacy controls in Databricks?

Labels: Get Started Discussions, Privacy controls, RBAC
Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, Could you please check https://docs.databricks.com/en/lakehouse-architecture/security-compliance-and-privacy/best-practices.html and see if this helps? Also, please tag @Debayan in your next comment, which will notify me. Thanks!

Policepatil
by New Contributor III
  • 1956 Views
  • 0 replies
  • 0 kudos

Records are missing while creating new dataframe from one big dataframe using filter

Hi, I have data in a file like below. There are different types of rows in my input file; column number 8 defines the type of the record. In the above file we have 4 types of records, 00 to 03. My requirement is: there will be multiple files in the source path, ea...

[4 screenshot attachments]
Policepatil
by New Contributor III
  • 2376 Views
  • 2 replies
  • 0 kudos

Records are missing while creating new data from one big dataframe using filter

Hi, I have data in a file like below. There are different types of rows in my input file; column number 8 defines the type of the record. In the above file we have 4 types of records, 00 to 03. My requirement is: there will be multiple files in the source path, e...

[4 screenshot attachments]
Latest Reply
Policepatil
New Contributor III
  • 0 kudos

Hi @Retired_mod, If I run again with the same files, sometimes records will be missing from the same files as in the previous run, or records will be missing from a different file. Example: run 1: 1 record missing in file1, no issue with other files; run 2: 1 record missin...

1 More Replies
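When a split-by-filter appears to "lose" records intermittently, a quick sanity check is to reconcile counts: the per-type counts must sum to the source count. This is a plain-Python sketch of that check (not Spark code; per the post, column 8, i.e. index 7, holds the record type, and all other names are illustrative):

```python
def split_by_type(rows, type_index=7):
    """Bucket rows by the record-type column: a sketch of splitting one
    big dataframe into one dataframe per record type."""
    buckets = {}
    for row in rows:
        buckets.setdefault(row[type_index], []).append(row)
    return buckets

rows = [
    ("a", "b", "c", "d", "e", "f", "g", "00"),
    ("h", "i", "j", "k", "l", "m", "n", "01"),
    ("o", "p", "q", "r", "s", "t", "u", "00"),
]
buckets = split_by_type(rows)

# Reconciliation: if the equivalent check fails in the real pipeline, records
# are being dropped upstream of the filters (e.g. a source that is re-read
# differently for each filter), not by the filter predicate itself.
assert sum(len(v) for v in buckets.values()) == len(rows)
```

In Spark, the intermittent symptom described here often suggests the source is re-read for each filtered output; materializing or caching the parsed dataframe once before applying the per-type filters is a common way to rule that out.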
6502
by New Contributor III
  • 3381 Views
  • 0 replies
  • 0 kudos

Dashboard backup/download

Hello all, I'm trying to download all the dashboard definitions; however, I can only download the folder structure with no files inside. The procedure I'm using is: go to the dashboard folder, then download it as a DBC or source archive. Unfortunately, the DBC ...

seefoods
by Contributor
  • 1133 Views
  • 1 reply
  • 0 kudos

ganglia metrics

Hello everyone, I have built this script in order to collect Ganglia metrics, but the size of the Ganglia stderr and stdout is 0. It doesn't work. I have put this script in the Workspace because, with the Databricks migration, all init scripts should be placed in the Workspace...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, is there any error you are getting? Also, please tag @Debayan in your next comment, which will notify me. Thanks!

MetaMaestro
by New Contributor III
  • 1450 Views
  • 1 reply
  • 0 kudos

GCP hosted Databricks - DBFS temp files - Not Found

I've been working on obtaining DDL at the schema level in the Hive metastore within GCP-hosted Databricks. I've implemented Python code that generates SQL files in the dbfs/temp directory. However, when running the code, I'm encountering a "file path n...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, a code snippet together with the whole error may help to determine the issue; also, considering the above points may work as a fix.

Rajan_651
by New Contributor
  • 2336 Views
  • 1 reply
  • 0 kudos

Unable to find permission button in Sql Warehouse for providing Access

Hi everyone, I am unable to see the permission button in SQL Warehouse to provide access to other users. I have admin rights and Databricks is a Premium subscription.

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, Could you please provide a screenshot of the SQL warehouse? You can also go through https://docs.databricks.com/en/security/auth-authz/access-control/sql-endpoint-acl.html. Please tag @Debayan in your next comment, which will notify me. Th...

