- 975 Views
- 2 replies
- 0 kudos
Error ingesting files with databricks jobs
The source path I want to ingest files from is: `gs://bucket-name/folder1/folder2/*/*.json`. I have a file in this path that ends with `.json.gz`, and the Databricks job ingests this file even though it isn't supposed to. How can I fix it? Thanks.
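One possible workaround (an assumption, not confirmed in the thread) is to filter on file names explicitly rather than relying on the path glob alone; Spark's DataFrame reader accepts a `pathGlobFilter` option that applies a glob to file names only. The intended matching behaviour can be sketched in plain Python:

```python
from fnmatch import fnmatch

# Hypothetical file listing; only names ending exactly in ".json" should match.
files = ["a.json", "b.json.gz", "c.txt", "nested.json"]
matched = [f for f in files if fnmatch(f, "*.json")]
print(matched)  # ['a.json', 'nested.json']
```

In Spark the equivalent would be `.option("pathGlobFilter", "*.json")` on the reader; this is a sketch of the idea, not the thread's confirmed fix.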
- 1473 Views
- 1 replies
- 0 kudos
Databricks Certification Exam Got Suspended. Require Support for the Same.
Hello Team, I had a terrible experience while attempting my 1st Databricks certification. Abruptly, the proctor asked me to show my desk; after I showed it, he/she asked multiple times, wasted my time, and then suspended my exam, saying I have exceeded ...
- 0 kudos
Hi @surya_1527, Thank you for posting your concern on Community! To expedite your request, please list your concerns on our ticketing portal. Our support staff will be able to act faster on the resolution (our standard resolution time is 24-48 hour...
- 1623 Views
- 1 replies
- 0 kudos
How to search on empty string on text filter with Lakeview Dashboards
Hi, I have created a Lakeview dashboard with a couple of filters and a table. Now I would like to check whether a certain filter (column) has an empty string, but if I search for ' ' it shows 'no data'. I am wondering how I can search for an empty stri...
- 0 kudos
Hi @Siebert_Looije, To search for an empty string in a text field filter in a Databricks Lakeview dashboard, you can use the "Null" filter option, which also includes empty strings. Here's how you can filter out rows with empty string values in...
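The idea above amounts to treating empty strings and NULLs as one bucket when filtering. A minimal SQL-level sketch using Python's built-in sqlite3 (table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name TEXT)")
conn.executemany("INSERT INTO t VALUES (?)", [("a",), ("",), (None,)])

# Mirror a 'Null' filter option that also catches empty strings.
count = conn.execute(
    "SELECT COUNT(*) FROM t WHERE name = '' OR name IS NULL"
).fetchone()[0]
print(count)  # 2
```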
- 2079 Views
- 3 replies
- 1 kudos
Add Oracle Jar to Databricks cluster policy
I created a policy for users to use when they create their own job clusters. When I'm editing the policy, I don't have the UI options for adding a library (I can only see the Definitions and Permissions tabs). I need to add, via JSON, the option to allow th...
- 1 kudos
@adrianhernandez, are you an admin of the workspace? If not, you might be missing permissions; if you have policies enabled, an admin can allow you. https://docs.databricks.com/en/administration-guide/clusters/policies.html#libraries If your workspace is Unity Cat...
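Per the linked docs page, cluster policies can carry a `libraries` array in their API payload. A hedged sketch of what that might look like (policy name, jar path, and Maven coordinates below are hypothetical):

```json
{
  "name": "job-cluster-policy",
  "definition": "{\"spark_version\": {\"type\": \"unlimited\"}}",
  "libraries": [
    { "jar": "dbfs:/FileStore/jars/ojdbc8.jar" },
    { "maven": { "coordinates": "com.oracle.database.jdbc:ojdbc8:21.9.0.0" } }
  ]
}
```

Clusters created from the policy would then install these libraries automatically; check the policy docs for the exact fields supported in your workspace.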
- 8177 Views
- 1 replies
- 2 kudos
Databricks Asia-Pacific LLM Cup 2023 - So you think you can hack?
You are invited to participate in Databricks Asia-Pacific LLM Cup 2023, an exciting virtual hackathon which kicks off in the month of October. Businesses across Asia-Pacific and Japan will have the opportunity to build, iterate and polish your LLM id...
- 1455 Views
- 0 replies
- 2 kudos
dbutils.fs.ls MAX_LIST_SIZE_EXCEEDED
Hi! I'm experiencing different behaviours between two DBX workspaces when trying to list file contents from an abfss: location. In workspace A, running len(dbutils.fs.ls('abfss://~~@~~~~.dfs.core.windows.net/~~/')) results in "Out[1]: 1551", while runni...
- 2355 Views
- 5 replies
- 1 kudos
getArgument works fine in interactive cluster 10.4 LTS, raises error in job cluster 10.4 LTS
Hello, I am trying to use the getArgument() function in a spark.sql query. It works fine if I run the notebook via an interactive cluster, but gives an error when executed via a job run in an instance pool. Query: OPTIMIZE <table> where date = replace(re...
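One hedged workaround (an assumption, not confirmed in the thread) is to resolve the parameter on the Python side and inline it into the SQL string, instead of calling getArgument() inside the SQL text. The string-building part can be sketched standalone:

```python
def build_optimize_sql(table: str, date: str) -> str:
    # Inline the parameter on the Python side; on Databricks the value
    # would typically come from dbutils.widgets.get("date") (name hypothetical).
    return f"OPTIMIZE {table} WHERE date = '{date}'"

sql = build_optimize_sql("events", "2023-10-01")
print(sql)  # OPTIMIZE events WHERE date = '2023-10-01'
```

On a cluster, the resulting string would be passed to spark.sql(sql); this avoids relying on getArgument() resolution inside the job-run SQL context.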
- 1 kudos
Hi @Kaniz_Fatma, Would you be able to respond to my last comment? I couldn't manage to get it working yet. Thank you in advance.
- 1793 Views
- 1 replies
- 0 kudos
AWS Databricks VS AWS EMR
Hi, which services should I use for a data lake implementation? Is there any cost comparison between Databricks and AWS EMR? Which one is best to choose?
- 0 kudos
@AH, that depends on the use case. If your implementation involves data lake, ML, or data engineering tasks, it is better to go with Databricks, as it has a good UI, good governance using Unity Catalog for your data lake, and good consumer tool su...
- 1645 Views
- 1 replies
- 1 kudos
Resolved! System billing usage table - Usage column
Hello experts, Could someone please explain what exactly is contained in the usage column of the system.billing.usage table? We ran specific queries in a cluster trying to calculate the cost, and we observe that the DBUs shown in the system table are ...
- 1 kudos
@elgeo, both should be the same, unless somehow we miss picking the proper plan DBU price. The usage column will have complete information related to SKU name, DBU units, etc. If you use the Azure Databricks calculator and compare, we should see similar results.
- 2614 Views
- 5 replies
- 2 kudos
Roadmap on export menu option for SQL Query and Dashboard Types in Workspace
Are there plans for an export option for SQL Query and SQL Dashboard in the Workspace explorer screen, similar to notebooks? Background: we need a way to export and back up any queries and dashboards to save design work and move from staging environments ...
- 2 kudos
The best option would be to have them just under a Git repo (especially dashboards).
- 913 Views
- 2 replies
- 0 kudos
how to save variables in one notebook to be imported into another?
Say I have a list of values, dictionaries, and variable names in `notebook1.ipynb` that I'd like to re-use / import in another `notebook2.ipynb`. For example, in `notebook1.ipynb`, I have the following: var1 = "dallas" var_lst = [100, 200, 300, 400, ...
- 0 kudos
You can use %run ./notebook1 at the top of notebook2, so notebook2 will use the variables defined in notebook1.
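Besides %run, a hedged alternative sketch (file path and variable names are hypothetical): serialize the variables in notebook1 and reload them in notebook2.

```python
import json
import os
import tempfile

# "notebook1": persist the variables to a shared location.
state = {"var1": "dallas", "var_lst": [100, 200, 300, 400]}
path = os.path.join(tempfile.gettempdir(), "shared_state.json")
with open(path, "w") as f:
    json.dump(state, f)

# "notebook2": reload them.
with open(path) as f:
    loaded = json.load(f)
print(loaded["var1"])  # dallas
```

On Databricks the path would typically be a DBFS or volume path rather than a local temp directory, so the file survives across clusters.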
- 1549 Views
- 1 replies
- 0 kudos
Can Error Message be un Redacted
Is there a way to un-redact the logging of error messages? Alternatively, it would be nice to have access to the source code of involved classes like com.databricks.backend.common.util.CommandLineHelper or com.databricks.util.UntrustedUtils. I'm getting t...
- 0 kudos
Hi @floringrigoriu , you can reach out to Databricks support by filing a support ticket.
- 1204 Views
- 1 replies
- 0 kudos
Retrieve DBU per query executed
Hello experts, Do you know how we can retrieve the DBUs consumed for a specific query? Thank you.
- 0 kudos
I couldn't find a metadata table. However, a workaround is to take the DBU rate of the current cluster (retrieve it either online or, to be more accurate, from the compute page at the right) and multiply it by the time in minutes that the query took ...
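The workaround above is simple arithmetic; a minimal sketch (the rate and runtime below are hypothetical):

```python
def approx_query_dbus(cluster_dbu_per_hour: float, runtime_minutes: float) -> float:
    # DBUs ~= the cluster's hourly DBU rate times the query runtime in hours.
    return cluster_dbu_per_hour * runtime_minutes / 60.0

# An 8 DBU/h cluster running a 15-minute query consumes roughly 2 DBUs.
print(approx_query_dbus(8.0, 15.0))  # 2.0
```

Note this attributes the whole cluster to the query; concurrent queries on the same cluster would share that cost.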
- 1544 Views
- 2 replies
- 1 kudos
running notebook job from remote github repository fails, but do not fail on python script type
Hi all, I am trying to run a notebook from a remote repository, but the job fails. I set up the job as follows: my project structure is as such: but the output I get is like such: The thing is, if I set the job type to "Python Script" I don't encounter this...
- 1 kudos
@NielsMH, if you want to run your jobs based on job name, please use the new preview feature that Databricks released, Databricks Asset Bundles (DAB format); there you can run your job based on its job name. Regarding the remote repo: are you using GitHub Actions or the API? Loo...
- 961 Views
- 2 replies
- 0 kudos
Query Load/ Warehouse Load missing
For observability, warehouse load and query load percentage are two major requirements. How do I fetch those details?
- 0 kudos
Hi @FatemaMalu, You can fetch the warehouse load and query load percentage using the Databricks REST API. Specifically, you can use the SQL Warehouses API to get the status of your SQL warehouses. https://docs.databricks.com/api/workspace/warehouses
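A minimal sketch of preparing a call to that endpoint (the host and token below are placeholders, and no request is actually sent here):

```python
def warehouses_list_request(host: str, token: str) -> tuple:
    # GET /api/2.0/sql/warehouses lists warehouses and their state,
    # per the API docs linked above.
    url = f"{host}/api/2.0/sql/warehouses"
    headers = {"Authorization": f"Bearer {token}"}
    return url, headers

url, headers = warehouses_list_request(
    "https://example.cloud.databricks.com", "TOKEN")
print(url)  # https://example.cloud.databricks.com/api/2.0/sql/warehouses
```

The pair would then be passed to an HTTP client such as requests; per-query metrics would come from the query history endpoints rather than this one.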