- 8526 Views
- 2 replies
- 2 kudos
Databricks Asia-Pacific LLM Cup 2023 - So you think you can hack?
You are invited to participate in the Databricks Asia-Pacific LLM Cup 2023, an exciting virtual hackathon that kicks off in October. Businesses across Asia-Pacific and Japan will have the opportunity to build, iterate and polish their LLM id...
Hi! This sounds like a great opportunity for anyone interested in machine learning who wants to develop their skills through real projects. If you are looking for additional resources to promote your ideas or want to increase the effectiveness of...
- 2399 Views
- 4 replies
- 2 kudos
Editor lags
Is anyone else experiencing lag in the Databricks notebook editor? Typing is sometimes very slow and causes the cell to hang for a moment. (Using Python.)
The notebook editor has been sluggish for me regardless of how I split work across cells or which language I use. Very frustrating.
- 457 Views
- 2 replies
- 0 kudos
CICD Unity Catalog
Hello, how do you handle deploying Databricks Unity Catalog resources (schemas, tables, views, permissions) across environments? What are the strategies for building (compiling), validating and deploying these resources and ensuring they're error-free a...
Thanks, yes, DABs can be used, but I'm still wondering how to validate the syntax, dependencies, permissions, etc. before deploying the DAB. Example: when you deploy an SQL Database DACPAC file, first you need to build the project and generate a DACPAC...
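For what it's worth, the Databricks CLI has a `databricks bundle validate` command that checks the bundle configuration before a `databricks bundle deploy`. Below is a minimal sketch of chaining the two in a CI step with Python's subprocess module; the `dev` target name is just a placeholder.
```python
import subprocess

def run(cmd):
    """Run a CLI command and fail fast on a non-zero exit code."""
    print(">>", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Validate the bundle configuration (syntax, resource references, target settings)
# before anything is deployed; "dev" is a placeholder target name.
run(["databricks", "bundle", "validate", "-t", "dev"])

# Deploy only if validation succeeded (run() raises on failure above).
run(["databricks", "bundle", "deploy", "-t", "dev"])
```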
- 503 Views
- 3 replies
- 2 kudos
Resolved! Specify cluster when using dbutils.notebook API
Hello! Does anyone know if there is a way of specifying a cluster when using dbutils.notebook.run()? As I understand it, this command will create a job compute for the run, but what if I want to use my general-purpose cluster? I have been thinking of %run b...
Hi @ErikJ, check the docs. You can also run the help method. The parameters of the method are: 1. path, 2. timeoutSeconds, 3. arguments.
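For context, here is a minimal sketch of the call with those three parameters (the notebook path and argument values are hypothetical). As far as I understand, the child notebook runs as an ephemeral notebook job on the same cluster as the caller, so there is no parameter for picking a different cluster.
```python
# dbutils is available inside a Databricks notebook; path and arguments are placeholders.
result = dbutils.notebook.run(
    "/Workspace/Users/someone@example.com/child_notebook",  # path
    600,                                                     # timeoutSeconds
    {"run_date": "2024-01-01"},                              # arguments
)
print(result)  # whatever the child notebook passed to dbutils.notebook.exit()
```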
- 454 Views
- 2 replies
- 0 kudos
Lakehouse monitoring profile_metrics and drift_metrics are empty
I am wondering why, when I create a lakehouse monitor using either the API or UI, my profile_metrics and drift_metrics tables are empty. This error came after I changed my date column from type date to timestamp. Any information would be greatly appreciate...
The issue occurs when the table doesn't include data with a timestamp within the last 30 days, as the analysis on the table is only run over the last 30 days' worth of data.
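A quick way to check whether that is the cause is to count the rows whose timestamp falls inside the 30-day window; the table and column names below are placeholders.
```python
from pyspark.sql import functions as F

# Placeholder table and timestamp column names; spark is the notebook's SparkSession.
df = spark.table("my_catalog.my_schema.monitored_table")

recent = df.filter(F.col("event_ts") >= F.expr("current_timestamp() - INTERVAL 30 DAYS"))
print("rows in the last 30 days:", recent.count())
```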
- 3529 Views
- 4 replies
- 2 kudos
Cluster compute metrics
I want to fetch compute metrics (hardware, GPU and Spark) and use them in certain dashboards on Databricks, however I'm not able to fetch them. I have tried GET API requests and the system tables. The system tables only have CPU utilization and memory utili...
Can you tell us which 3rd-party tools you are referring to?
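For reference, the per-node utilization samples mentioned in the post are exposed as a system table on recent workspaces. A rough sketch, assuming `system.compute.node_timeline` is enabled in your account and that the `cluster_id` and `start_time` columns are named as below; GPU and Spark-level metrics may not be available there.
```python
from pyspark.sql import functions as F

# Assumed system table and column names; adjust if your workspace exposes different ones.
timeline = spark.table("system.compute.node_timeline")

recent = (
    timeline
    .filter(F.col("cluster_id") == "1234-567890-abcde123")  # placeholder cluster id
    .filter(F.col("start_time") >= F.expr("current_timestamp() - INTERVAL 1 DAY"))
)
display(recent)  # per-node CPU/memory samples; GPU metrics may not be included
```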
- 123 Views
- 0 replies
- 0 kudos
Deleting old Databricks workspace but keeping the data
Hi! I am currently working in an old Databricks workspace where we have stored a lot of data. We have migrated over to a new workspace and I want to delete the old one. Before deleting the workspace, I want to store the data in a blob storage in Azure...
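One possible approach before deleting the workspace is to deep clone the Delta tables to the Azure storage account (and copy any raw files with `dbutils.fs.cp`). A sketch, assuming the old workspace's cluster can reach the storage account; the abfss path and table names are placeholders.
```python
# Placeholder storage container and table names.
target_root = "abfss://archive@mystorageaccount.dfs.core.windows.net/old-workspace"

# Deep clone copies the data files and Delta metadata to the external location,
# so the table survives the workspace deletion.
spark.sql(f"""
  CREATE OR REPLACE TABLE delta.`{target_root}/sales`
  DEEP CLONE old_db.sales
""")

# For non-Delta files, a recursive copy also works.
dbutils.fs.cp("dbfs:/mnt/old-data", f"{target_root}/raw", recurse=True)
```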
- 3700 Views
- 9 replies
- 2 kudos
Create a Workflow Schedule with varying Parameters
We aim to reduce the number of notebooks we create to a minimum and instead make them fairly flexible. Therefore we have a Factory setup that takes in a parameter to vary the logic. However, when it comes to Workflows, we are forced to create multipl...
We're also running into this issue on my team, where having multiple cron schedules would be handy. We have some pipelines that we want to run on multiple schedules, say to refresh data: "Run every Sunday at midnight" and "Run on the first day of the mont...
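One common workaround, since a job currently carries only a single cron schedule, is to keep one parameterized job and trigger it externally with different parameters through the Jobs `run-now` endpoint. A rough sketch with `requests`; the host, token, job id, and parameter names are placeholders.
```python
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
TOKEN = "<personal-access-token>"                             # placeholder token
JOB_ID = 123456789                                            # placeholder job id

def run_job(notebook_params):
    """Trigger the shared, parameterized job with a specific set of parameters."""
    resp = requests.post(
        f"{HOST}/api/2.1/jobs/run-now",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"job_id": JOB_ID, "notebook_params": notebook_params},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # contains the run_id of the triggered run

# Called from two different external schedulers, for example:
run_job({"refresh_scope": "weekly"})
run_job({"refresh_scope": "monthly"})
```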
- 298 Views
- 0 replies
- 0 kudos
Unable to log in to Community Edition account
Hello, I created a community edition account yesterday. Earlier today, I was still able to sign in and everything worked as expected. Now, whenever I try to sign in, I get the following error: "We were not able to find a Community Edition workspace w...
- 268 Views
- 0 replies
- 1 kudos
Accessing data in Databricks/Unity Catalog from a SharePoint list
Our organization is moving all of its corporate master data to Databricks. Our business operations units need access to this data in the form of look-ups or drop-down lists in tools like Excel and SharePoint lists. It's simple eno...
- 4120 Views
- 7 replies
- 1 kudos
Log signature and input data for Spark LinearRegression
I am looking for a way to log my `pyspark.ml.regression.LinearRegression` model with input and signature data. The usual examples I found use sklearn, where they can simply do: # Log the model with signature and input example signature =...
@MohsenJ @javierbg @Abi105 I have found a solution to this issue as I was trying to deploy Spark ML Models to Unity Catalog. Please view my blog and let me know if it helps solve your issues! https://medium.com/p/7d04e8539540
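For anyone landing here later, a minimal sketch of logging a `pyspark.ml` LinearRegression (wrapped in a Pipeline) with a signature and input example via `mlflow.spark.log_model`; the column names and toy data are made up, and the signature is inferred from small pandas samples.
```python
import mlflow
from mlflow.models import infer_signature
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression

# Toy training data with made-up column names (spark is the notebook's SparkSession).
train_df = spark.createDataFrame(
    [(1.0, 2.0, 5.0), (2.0, 1.0, 4.0), (3.0, 3.0, 9.0)],
    ["x1", "x2", "label"],
)

pipeline = Pipeline(stages=[
    VectorAssembler(inputCols=["x1", "x2"], outputCol="features"),
    LinearRegression(featuresCol="features", labelCol="label"),
])
model = pipeline.fit(train_df)

# Infer the signature from small pandas samples of the raw inputs and the predictions.
input_sample = train_df.select("x1", "x2").limit(5).toPandas()
output_sample = model.transform(train_df).select("prediction").limit(5).toPandas()
signature = infer_signature(input_sample, output_sample)

with mlflow.start_run():
    mlflow.spark.log_model(
        model,                      # the fitted PipelineModel
        artifact_path="model",
        signature=signature,
        input_example=input_sample,
    )
```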
- 743 Views
- 1 replies
- 0 kudos
FAILED_READ_FILE.NO_HINT error
We read data from CSV files in the volume into the table using COPY INTO. The first 200 files were added without problems, but now we are no longer able to add any new data to the table, and the error is FAILED_READ_FILE.NO_HINT. The CSV format is always th...
Py4JJavaError: An error occurred while calling o392.sql. : org.apache.spark.SparkException: [FAILED_READ_FILE.NO_HINT] Error while reading file dbfs:/Volumes/...txt. SQLSTATE: KD001 at org.apache.spark.sql.errors.QueryExecutionErrors$.cannotReadFiles...
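When COPY INTO fails like this, one way to narrow it down is to read the suspect file directly in permissive mode and inspect the rows Spark cannot parse. A sketch; the volume path and column names are placeholders, and the schema should mirror your real CSV layout.
```python
from pyspark.sql.types import StructType, StructField, StringType

# Placeholder path to the file named in the error, and a throwaway all-string schema
# with an extra column that captures raw lines Spark cannot parse.
bad_file = "/Volumes/my_catalog/my_schema/my_volume/landing/file_0201.txt"

schema = StructType([
    StructField("col_a", StringType(), True),   # placeholder column names
    StructField("col_b", StringType(), True),
    StructField("_corrupt_record", StringType(), True),
])

df = (
    spark.read
    .schema(schema)
    .option("header", "true")
    .option("mode", "PERMISSIVE")
    .option("columnNameOfCorruptRecord", "_corrupt_record")
    .csv(bad_file)
)

# Unparseable lines land in _corrupt_record; a few of them usually show what changed
# in the newer files (delimiter, quoting, encoding, extra columns, ...).
df.filter("_corrupt_record IS NOT NULL").show(10, truncate=False)
```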
- 153 Views
- 0 replies
- 0 kudos
Account Level API GET Call requiring User-Agent Header
We recently enabled Unity Catalog, so we are using the Account Level API, and we're a little stuck on a GET call. For some reason, just the GET call requires a User-Agent header, while POST, PUT, PATCH, etc. all work without it. For the workspace API, there was no need for User-A...
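As a workaround, explicitly setting a User-Agent header on the GET request should satisfy the gateway. A sketch with `requests` against the Azure accounts host; the account id, token, and the metastores endpoint are just examples, swap in the call you are actually making.
```python
import requests

ACCOUNT_HOST = "https://accounts.azuredatabricks.net"       # Azure account console
ACCOUNT_ID = "00000000-0000-0000-0000-000000000000"          # placeholder account id
TOKEN = "<account-level-token>"                              # placeholder token

resp = requests.get(
    f"{ACCOUNT_HOST}/api/2.0/accounts/{ACCOUNT_ID}/metastores",  # example GET endpoint
    headers={
        "Authorization": f"Bearer {TOKEN}",
        # Explicit User-Agent, since the GET call was rejected without one.
        "User-Agent": "my-deployment-script/1.0",
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```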
- 683 Views
- 1 replies
- 1 kudos
Summit23
I have really enjoyed the summit so far! I am a biostatistics graduate student at Harvard and am learning so much here.
Hi, if you had a great time at the Summit, be sure to visit our community events portal for upcoming events and join your regional user group for meetups. Thanks, Anushree
- 842 Views
- 1 replies
- 0 kudos
Summit 2024
It has been an exciting event that brought together industry leaders, researchers, and practitioners to explore the latest advancements in artificial intelligence and data science. With cutting-edge presentations, hands-on workshops, and networking o...
Hi, if you had a great time at the Summit, be sure to visit our community events portal for upcoming events and join your regional user group for meetups. Thanks, Anushree