- 7086 Views
- 3 replies
- 3 kudos
Resolved! Databricks community edition still available?
Is the Databricks platform still available in its Community Edition (outside Azure, AWS, or GCP)? Can someone share the updated link? Thanks, Elisa
Reply (3 kudos): It is still available, but as far as I know it is always linked to Azure, GCP, or AWS.
- 19881 Views
- 1 reply
- 1 kudos
Resolved! Notebook Langchain ModuleNotFoundError: No module named 'langchain.retrievers.merger_retriever'
Hi, as mentioned in the title, I am receiving this error despite running %pip install --upgrade langchain. The specific line of code is: from langchain.retrievers.merger_retriever import MergerRetriever. All other langchain imports work when this line is commented out. Same line w...
Reply (1 kudo): More specifically, langchain releases a new update every few days, and it is likely that you are using code or a library that needs a later version of langchain than you have (or, perhaps, a later version that removed whatever part of langchain you r...
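As a rough illustration of that point, the sketch below (assuming a Databricks notebook; the pinned version number is purely illustrative) pins langchain to a specific release and checks what actually got installed before attempting the import:

```python
# Hedged sketch, assuming a Databricks notebook. The pinned version is illustrative;
# pick one whose documentation still lists the module you need.
%pip install langchain==0.0.329

# Run the following in a fresh cell (or after dbutils.library.restartPython()) so the
# newly installed version is the one that gets imported.
import importlib.metadata

print(importlib.metadata.version("langchain"))  # confirm which version is actually active

# This import resolves only if the pinned release still ships this module.
from langchain.retrievers.merger_retriever import MergerRetriever
```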
- 2513 Views
- 2 replies
- 1 kudos
Databricks notebook execution using only one task
I am running a Databricks notebook. While running, I only see one task on one worker getting started. My cluster has a minimum of 6 workers, but it seems they are not getting used. I am performing a read operation from Cosmos DB. Can someone please help me her...
Reply (1 kudo): If your code does not use Spark, it will not use any machines except the driver. If you're using Spark but the source data you operate on has 1 partition, there will be only 1 task. Hard to say more without knowing what you are doing in more de...
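A quick way to check that hypothesis is sketched below; spark.range stands in for the DataFrame returned by the Cosmos DB read in the question, and the repartition count is illustrative:

```python
# Minimal sketch: inspect how many partitions the source DataFrame has, and
# repartition it if everything landed in a single partition (one task, one core).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.range(1_000_000)        # stand-in for the DataFrame returned by the Cosmos DB read

print(df.rdd.getNumPartitions())   # if this prints 1, only one task will do the work

df = df.repartition(48)            # illustrative number; roughly match the total cores in the cluster
print(df.rdd.getNumPartitions())
```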
- 6035 Views
- 2 replies
- 1 kudos
MIT License and Fine-tuning
Some questions related to fine-tuning and the MIT License; I read the MIT License but am still confused about some points. If I fine-tune the Dolly-v2 model, say using LoRA and my own dataset: Do I "own" the fine-tuned model? Am I allowed to change the name...
Reply (1 kudo): I am not sure I agree with the discussion so far. While none of us here are lawyers, I think it's fairly straightforward to reason about the licensing. You have created a combined, derivative work from the Dolly weights in this case. You have copyright i...
- 1766 Views
- 0 replies
- 0 kudos
Databricks Rstudio Init Script Deprecated
OK, so I'm trying to use open-source RStudio on Azure Databricks. I'm following the instructions here: https://learn.microsoft.com/en-us/azure/databricks/sparkr/rstudio#install-rstudio-server-open-source-edition. I've installed the necessary init script ...
- 2136 Views
- 2 replies
- 1 kudos
Getting FileNotFoundException while using cloudFiles
Hi, the following is the code I am using to ingest the data incrementally (weekly): val ssdf = spark.readStream.schema(schema).format("cloudFiles").option("cloudFiles.format", "parquet").load(sourceUrl).filter(criteriaFilter); val transformedDf = ssdf.tran...
Reply (1 kudo): Danny, is another process mutating or deleting the incoming files?
- 8957 Views
- 6 replies
- 3 kudos
Resolved! Using Databricks for the end-to-end flow rather than using ADF for extracting data?
Currently, in our company we are using ADF + Databricks for all batch integration. Using ADF, data is first copied to ADLS Gen2 (from different sources like on-prem servers, FTP file-sharing solutions, etc.), then it is reformatted to CSV and it is copie...
Reply (3 kudos): @-werners- Is there any benefit to doing the extract part in Databricks itself, unlike our current architecture, where we first load to ADLS using ADF? I guess it is worth doing it all end to end using Databricks if there is better processing, lower lat...
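For context, here is a hedged sketch of what "doing the extract in Databricks itself" can look like: pulling a table over JDBC and landing it as Delta in ADLS. All hosts, secret scope names, and paths below are placeholders, and this assumes the source is network-reachable from the workspace.

```python
# Hedged sketch (Databricks notebook assumed, so `spark` and `dbutils` already exist).
# Extract directly from a JDBC source and land the result as Delta in ADLS.
df = (spark.read.format("jdbc")
      .option("url", "jdbc:sqlserver://onprem-host:1433;databaseName=sales")  # placeholder host/database
      .option("dbtable", "dbo.orders")                                         # placeholder table
      .option("user", dbutils.secrets.get(scope="etl", key="db-user"))         # assumes a secret scope named "etl"
      .option("password", dbutils.secrets.get(scope="etl", key="db-password"))
      .load())

(df.write.format("delta")
   .mode("overwrite")
   .save("abfss://raw@storageaccount.dfs.core.windows.net/orders"))            # placeholder ADLS path
```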
- 1645 Views
- 1 reply
- 0 kudos
RBAC, Security & Privacy controls
Could you please share with us best practices on implementing RBAC, security, and privacy controls in Databricks?
Reply (0 kudos): Hi, could you please check https://docs.databricks.com/en/lakehouse-architecture/security-compliance-and-privacy/best-practices.html and see if this helps? Also, please tag @Debayan in your next comment, which will notify me. Thanks!
- 2174 Views
- 0 replies
- 0 kudos
Records are missing while creating new dataframe from one big dataframe using filter
Hi, I have data in a file like below. I have different types of rows in my input file; column number 8 defines the type of the record. In the above file we have 4 types of records, 00 to 03. My requirement is: there will be multiple files in the source path, ea...
- 2806 Views
- 2 replies
- 0 kudos
Records are missing while creating new data from one big dataframe using filter
Hi, I have data in a file like below. I have different types of rows in my input file; column number 8 defines the type of the record. In the above file we have 4 types of records, 00 to 03. My requirement is: there will be multiple files in the source path, e...
Reply (0 kudos): Hi @Retired_mod, if I run again with the same files, sometimes records will be missing from the same files as in the previous run, or records will be missing from a different file. Example: run 1: 1 record missing in file1, no issue with other files; run 2: 1 record missin...
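The pattern being described, reading the files once and deriving one DataFrame per record type with a filter, looks roughly like the sketch below. The column position, type values, and path are assumptions taken from the question; caching the combined frame before filtering rules out the source changing between the per-type reads.

```python
# Hedged sketch: split one DataFrame into per-record-type DataFrames with filters.
# Assumes header-less CSV files where the 8th column (_c7, 0-based) holds the record
# type; the source path is a placeholder.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

raw = (spark.read.csv("dbfs:/path/to/source/", header=False)
       .withColumnRenamed("_c7", "record_type")
       .cache())                                   # pin the data so every filter sees the same snapshot

frames = {t: raw.filter(raw.record_type == t) for t in ["00", "01", "02", "03"]}

# The per-type counts should add up to the total; a mismatch points at the source
# (or the type-column assumption), not at the filters themselves.
print(raw.count(), {t: df.count() for t, df in frames.items()})
```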
- 3780 Views
- 0 replies
- 0 kudos
Dashboard backup/download
Hello all, I'm trying to download all the dashboard definitions; however, I can only download the folder structure with no files inside. The procedure I'm using is: go to the dashboard folder and download it as a DBC or source archive. Unfortunately, the DBC ...
- 1471 Views
- 1 reply
- 0 kudos
ganglia metrics
Hello everyone, I have built this script in order to collect Ganglia metrics, but the size of the Ganglia stderr and stdout is 0. It doesn't work. I have put this script in the Workspace because, with the migration, all Databricks init scripts should be placed in the Workspace...
Reply (0 kudos): Hi, is there any error you are getting? Also, please tag @Debayan in your next comment, which will notify me. Thanks!
- 1723 Views
- 1 reply
- 0 kudos
GCP hosted Databricks - DBFS temp files - Not Found
I've been working on obtaining DDL at the schema level in the Hive metastore within GCP-hosted Databricks. I've implemented Python code that generates SQL files in the dbfs/temp directory. However, when running the code, I'm encountering a "file path n...
Reply (0 kudos): Hi, the error code snippet with the whole error may help to determine the issue; also, considering the above points may work as a fix.
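One common cause of "file path not found" with DBFS temp files is mixing the two path styles, sketched below (the file name is illustrative, and this assumes a notebook where dbutils and the /dbfs fuse mount are available):

```python
# Hedged sketch of the two DBFS path styles that often get mixed up.
ddl = "CREATE TABLE example_tbl (id INT);"   # illustrative DDL text

# Python's built-in file APIs reach DBFS through the /dbfs/... mount point...
with open("/dbfs/tmp/schema_ddl.sql", "w") as f:
    f.write(ddl)

# ...while dbutils and Spark address the same file with the dbfs:/... URI scheme.
print(dbutils.fs.head("dbfs:/tmp/schema_ddl.sql"))
```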
- 2729 Views
- 1 reply
- 0 kudos
Unable to find the permission button in SQL Warehouse for granting access
Hi everyone, I am unable to see the permission button in SQL Warehouse to grant access to other users. I have admin rights and Databricks is on a Premium subscription.
Reply (0 kudos): Hi, could you please provide a screenshot of the SQL warehouse? Also, you can go through https://docs.databricks.com/en/security/auth-authz/access-control/sql-endpoint-acl.html. Also, please tag @Debayan in your next comment, which will notify me. Th...
- 1226 Views
- 1 reply
- 1 kudos
SQL Serverless - cost view
Hi, does anyone know how I can monitor the cost of SQL Serverless? I'm using Databricks in Azure and I'm not sure where to find the cost generated by compute resources hosted on Databricks.
Reply (1 kudo): Hi, you can calculate the pricing at https://www.databricks.com/product/pricing/databricks-sql and also https://azure.microsoft.com/en-in/pricing/details/databricks/#:~:text=Sign%20in%20to%20the%20Azure,asked%20questions%20about%20Azure%20pricing. For A...
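If the workspace has Unity Catalog system tables enabled, usage can also be broken down from the billing system table. A hedged sketch (column names follow the documented system.billing.usage schema; spark and display assume a notebook):

```python
# Hedged sketch, assuming Unity Catalog system tables are enabled for the account.
# Summarise DBU usage per day and per SKU; serverless SQL appears under its own SKU names.
usage = spark.sql("""
    SELECT usage_date, sku_name, SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    GROUP BY usage_date, sku_name
    ORDER BY usage_date, sku_name
""")
display(usage)
```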
Labels (post count per label): .CSV (1), Access Data (2), Access Databricks (3), Access Delta Tables (2), Account reset (1), ADF Pipeline (1), ADLS Gen2 With ABFSS (1), Advanced Data Engineering (2), AI (4), Analytics (1), Apache spark (1), Apache Spark 3.0 (1), Api Calls (1), API Documentation (3), App (2), Application (1), Architecture (1), asset bundle (1), Asset Bundles (3), Auto-loader (1), Autoloader (4), Aws databricks (1), AWS security token (1), AWSDatabricksCluster (1), Azure (6), Azure data disk (1), Azure databricks (15), Azure Databricks SQL (6), Azure databricks workspace (1), Azure Unity Catalog (6), Azure-databricks (1), AzureDatabricks (1), AzureDevopsRepo (1), Big Data Solutions (1), Billing (1), Billing and Cost Management (2), Blackduck (1), Bronze Layer (1), Certification (3), Certification Exam (1), Certification Voucher (3), CICDForDatabricksWorkflows (1), Cloud_files_state (1), CloudFiles (1), Cluster (3), Cluster Init Script (1), Comments (1), Community Edition (3), Community Event (1), Community Group (2), Community Members (1), Compute (3), Compute Instances (1), conditional tasks (1), Connection (1), Contest (1), Credentials (1), Custom Python (1), CustomLibrary (1), Data (1), Data + AI Summit (1), Data Engineer Associate (1), Data Engineering (3), Data Explorer (1), Data Ingestion & connectivity (1), Data Processing (1), Databrick add-on for Splunk (1), databricks (2), Databricks Academy (1), Databricks AI + Data Summit (1), Databricks Alerts (1), Databricks App (1), Databricks Assistant (1), Databricks Certification (1), Databricks Cluster (2), Databricks Clusters (1), Databricks Community (10), Databricks community edition (3), Databricks Community Edition Account (1), Databricks Community Rewards Store (3), Databricks connect (1), Databricks Dashboard (3), Databricks delta (2), Databricks Delta Table (2), Databricks Demo Center (1), Databricks Documentation (4), Databricks genAI associate (1), Databricks JDBC Driver (1), Databricks Job (1), Databricks Lakehouse Platform (6), Databricks Migration (1), Databricks Model (1), Databricks notebook (2), Databricks Notebooks (4), Databricks Platform (2), Databricks Pyspark (1), Databricks Python Notebook (1), Databricks Repo (1), Databricks Runtime (1), Databricks SQL (5), Databricks SQL Alerts (1), Databricks SQL Warehouse (1), Databricks Terraform (1), Databricks UI (1), Databricks Unity Catalog (4), Databricks Workflow (2), Databricks Workflows (2), Databricks workspace (3), Databricks-connect (1), databricks_cluster_policy (1), DatabricksJobCluster (1), DataCleanroom (1), DataDays (1), Datagrip (1), DataMasking (2), DataVersioning (1), dbdemos (2), DBFS (1), DBRuntime (1), DBSQL (1), DDL (1), Dear Community (1), deduplication (1), Delt Lake (1), Delta Live Pipeline (3), Delta Live Table (5), Delta Live Table Pipeline (5), Delta Live Table Pipelines (4), Delta Live Tables (7), Delta Sharing (2), deltaSharing (1), Deny assignment (1), Development (1), Devops (1), DLT (10), DLT Pipeline (7), DLT Pipelines (5), Dolly (1), Download files (1), Dynamic Variables (1), Engineering With Databricks (1), env (1), ETL Pipelines (1), External Sources (1), External Storage (2), FAQ for Databricks Learning Festival (2), Feature Store (2), Filenotfoundexception (1), Free trial (1), GCP Databricks (1), GenAI (1), Getting started (2), Google Bigquery (1), HIPAA (1), Hubert Dudek (19), import (1), Integration (1), JDBC Connections (1), JDBC Connector (1), Job Task (1), Learning (1), Lineage (1), LLM (1), Login (1), Login Account (1), Machine Learning (3), MachineLearning (1), Materialized Tables (2), Medallion Architecture (1), meetup (1), Metadata (1), Migration (1), ML Model (2), MlFlow (2), Model Training (1), Module (1), Monitoring (1), Networking (1), Notebook (1), Onboarding Trainings (1), OpenAI (1), Pandas udf (1), Permissions (1), personalcompute (1), Pipeline (2), Plotly (1), PostgresSQL (1), Pricing (1), Pyspark (1), Python (5), Python Code (1), Python Wheel (1), Quickstart (1), Read data (1), Repos Support (1), Reset (1), Rewards Store (2), Sant (1), Schedule (1), Serverless (3), serving endpoint (1), Session (1), Sign Up Issues (2), Software Development (1), Spark Connect (1), Spark scala (1), sparkui (2), Splunk (2), SQL (8), Summit23 (7), Support Tickets (1), Sydney (2), Table Download (1), Tags (3), terraform (1), Training (2), Troubleshooting (1), Unity Catalog (4), Unity Catalog Metastore (2), Update (1), user groups (1), Venicold (3), Voucher Not Recieved (1), Watermark (1), Weekly Documentation Update (1), Weekly Release Notes (2), Women (1), Workflow (2), Workspace (3)