- 713 Views
- 2 replies
- 0 kudos
databricks asset bundle error-terraform.exe": file does not exist
Hi, I am getting the error below while deploying a Databricks bundle using an Azure DevOps release: 2024-07-07T03:55:51.1199594Z Error: terraform init: exec: "xxxx\\.databricks\\bundle\\dev\\terraform\\xxxx\\.databricks\\bundle\\dev\\bin\\terraform.exe": ...
Hi Databricks team, any update on this issue? I'm experiencing the same issue. Our development VDI is isolated, so the Databricks CLI cannot download the necessary Terraform files. We are therefore forced to download and install them manually. W...
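In case it helps others on isolated networks: the CLI can reportedly be pointed at a pre-installed Terraform binary through environment variables instead of downloading one. A minimal sketch, assuming the DATABRICKS_TF_EXEC_PATH and DATABRICKS_TF_VERSION variables and a hypothetical install path (verify both against your CLI version's docs):

```python
import os
import subprocess

# Assumption: Terraform was downloaded and installed manually on the VDI.
# Point the Databricks CLI at the local binary so it skips the download step.
os.environ["DATABRICKS_TF_EXEC_PATH"] = r"C:\tools\terraform\terraform.exe"  # hypothetical path
os.environ["DATABRICKS_TF_VERSION"] = "1.5.5"  # must match the installed binary

# Deploy the bundle to the dev target with the variables in effect.
subprocess.run(["databricks", "bundle", "deploy", "--target", "dev"], check=True)
```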
- 2994 Views
- 5 replies
- 1 kudos
How to get data from Splunk on daily basis?
I am looking for ways to get data into Databricks from Splunk (similar to other data sources like S3, Kafka, etc.). I have received a suggestion to use the Databricks add-on to get/put data from/to Splunk. To pull the data from Databricks to S...
@Arch_dbxlearner - could you please follow this post for more details: https://community.databricks.com/t5/data-engineering/does-databricks-integrate-with-splunk-what-are-some-ways-to-send/td-p/22048
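Beyond the add-on, one common pattern is to pull from Splunk's REST search API in a daily Databricks job and land the results in a Delta table. A sketch under that assumption; the host, credentials, index, and target table below are all hypothetical placeholders:

```python
import json
import requests

SPLUNK_HOST = "https://splunk.example.com:8089"  # hypothetical management port
AUTH = ("svc_databricks", "***")                 # hypothetical service account

# Stream yesterday's events out of Splunk's REST export endpoint as JSON lines.
resp = requests.post(
    f"{SPLUNK_HOST}/services/search/jobs/export",
    auth=AUTH,
    data={
        "search": "search index=main earliest=-1d@d latest=@d",
        "output_mode": "json",
    },
    stream=True,
)
resp.raise_for_status()

# Each line is a JSON object; actual events sit under the "result" key.
rows = [json.loads(line).get("result") for line in resp.iter_lines() if line]
rows = [r for r in rows if r]

# `spark` is predefined in Databricks notebooks; land the pull as a Delta table.
spark.createDataFrame(rows).write.mode("append").saveAsTable("bronze.splunk_daily")
```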
- 1248 Views
- 1 reply
- 0 kudos
Foreign table to delta streaming table
I want to copy a table from a foreign catalog as my streaming table. This is the code I used, but I am getting the error: Table table_name does not support either micro-batch or continuous scan.; spark.readStream .table(table_name) ...
Bumping this thread because I have the same question and this is still the first result on Google (c. October 2024). Many thanks to anyone who is able to assist!
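For anyone landing here from search: the error means the foreign (federated) table only supports batch scans, so spark.readStream cannot read it directly. A hedged workaround sketch: copy the foreign table into a managed Delta table on a schedule and stream from the copy (all table names are placeholders):

```python
# Foreign tables only support batch scans, so read the source as a batch...
src = spark.read.table("foreign_catalog.schema.source_table")

# ...and land it in a managed Delta table (use MERGE instead of append if you
# need to de-duplicate incremental pulls).
src.write.mode("append").saveAsTable("main.schema.source_delta")

# Streaming then works against the Delta copy, which supports micro-batch scans.
stream_df = spark.readStream.table("main.schema.source_delta")
```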
- 233 Views
- 1 reply
- 0 kudos
duplicate data published in kafka offset
We have 25k records, which we publish in batches of 5k. We number the records with a row_number window function and create batches from that numbering. We have observed that some records (around 10-20) are getting published duplicated into 2 offsets. Ca...
Hi @dipali_globant, duplicate data in Kafka can arise in a batch-processing scenario for a few reasons. Here's an example of ensuring unique and consistent row numbering: from pyspark.sql import Window from pyspark.sql.functions import row_number wind...
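Completing the truncated snippet above as a sketch: row_number is only deterministic when the ordering key is unique, so ties (or re-computation of the plan between batches) can renumber rows and publish some of them twice. The unique_id column and source table here are hypothetical:

```python
from pyspark.sql import Window
from pyspark.sql.functions import col, row_number

df = spark.table("source_table")  # hypothetical source of the 25k records

# Order by a unique, stable key: ties in the ordering let Spark assign
# different numbers on re-computation, which duplicates rows across batches.
window = Window.orderBy(col("unique_id"))  # hypothetical unique column
numbered = df.withColumn("row_num", row_number().over(window)).cache()

batch_size = 5000
for start in range(0, 25000, batch_size):
    batch = numbered.filter(
        (col("row_num") > start) & (col("row_num") <= start + batch_size)
    )
    # ... publish `batch` to Kafka here ...
```

On the producer side, enabling Kafka's idempotent producer (enable.idempotence=true) also protects against duplicates caused by send retries.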
- 526 Views
- 4 replies
- 1 kudos
Alter table to add/update multiple column comments
I was wondering if there's a way to alter a table and add/update comments for multiple columns at once using SQL or API calls. For instance: ALTER TABLE <table_name> CHANGE COLUMN <col1> COMMENT '<comment1>', CHANGE COLUMN <col2> COMMENT 'comment2'; ...
Hi, assuming you have a dictionary mapping column names to comments, you can do this using PySpark like this: columns_comments = { "col1": "comment1", "col2": "comment2", # Add all your columns and comments here } for col, comment in c...
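Completing the truncated loop above: Databricks SQL has no single-statement multi-column comment form, so the usual approach is one ALTER TABLE ... ALTER COLUMN per entry. A sketch with placeholder names:

```python
columns_comments = {
    "col1": "comment1",
    "col2": "comment2",
    # Add all your columns and comments here.
}

table_name = "main.my_schema.my_table"  # placeholder

for col_name, comment in columns_comments.items():
    escaped = comment.replace("'", "''")  # escape single quotes for SQL
    spark.sql(
        f"ALTER TABLE {table_name} ALTER COLUMN {col_name} COMMENT '{escaped}'"
    )
```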
- 243 Views
- 1 reply
- 0 kudos
How do i find total number of input tokens to genie ?
I am calculating usage analytics for my workplace, where they use Genie. I have given the following to my Genie space as its definition: (1) instructions, (2) example SQL queries, (3) within the catalog, I went to the relevant table schemas and added comments, descriptio...
Or is there any set of tables and functions to determine the number of input and output tokens per query?
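One heavily hedged place to look: workspaces that expose serving system tables record per-request token counts in system.serving.endpoint_usage. Whether Genie requests land in that table is an assumption to verify for your workspace before relying on it:

```python
# Assumption: Genie traffic is logged like other serving usage in
# system.serving.endpoint_usage (check this in your own workspace).
usage = spark.sql("""
    SELECT
        request_time,
        input_token_count,
        output_token_count
    FROM system.serving.endpoint_usage
    ORDER BY request_time DESC
    LIMIT 100
""")
usage.show()
```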
- 6223 Views
- 1 reply
- 0 kudos
how to use R in databricks
Hello everyone. I am a new user of Databricks; they implemented it at the company where I work. I am a business analyst and I know a bit of R, not much, and when I saw that Databricks could use R I was very excited because I thought that the...
Hello JCamiloCS, did you figure it out? We have had the same question, so I just wonder if you found any good guidance.
- 474 Views
- 7 replies
- 0 kudos
Cluster logs folder
Hi, I can't seem to find the cluster_logs folder. Can anyone help me find where the cluster logs are stored? Best regards
Thank you for the help! I have enabled predictive optimization for Unity Catalog, thinking it would automatically perform VACUUM on the tables I have in my delta lake. With that in mind, I assumed VACUUM wouldn't require further attention. Would it be...
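On the original question: cluster logs are only written somewhere findable if log delivery is configured on the cluster (Compute > your cluster > Advanced options > Logging). A sketch for browsing them, assuming a hypothetical dbfs:/cluster-logs destination:

```python
# Assumption: the cluster's log delivery destination was set to dbfs:/cluster-logs.
# Delivered logs are organized per cluster ID, with driver/, executor/, and
# eventlog/ subfolders.
cluster_id = "0123-456789-abcdefgh"  # hypothetical cluster ID
for entry in dbutils.fs.ls(f"dbfs:/cluster-logs/{cluster_id}/"):
    print(entry.path)
```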
- 2320 Views
- 1 reply
- 0 kudos
requirements.txt with cluster libraries
Cluster libraries are supported from version 15.0 - Databricks Runtime 15.0 | Databricks on AWS. How can I specify the requirements.txt file path in the libraries of a job cluster in my workflow? Can I use a relative path? Is it relative to the root of th...
To use the new "requirements.txt" feature in your cluster, do the following: change your cluster's "Databricks Runtime Version" to 15.0 or greater (example: "15.4 LTS ML (includes Apache Spark 3.5.0, Scala 2.12)"), then navigate to the cluster's "Libraries...
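For the job-cluster part of the question, a hedged sketch of a Jobs API 2.1 task payload: runtimes 15.0+ accept a requirements entry in libraries, and an absolute /Workspace path is the safer choice over a relative one (all paths and names here are placeholders):

```python
# Sketch of a task payload for POST /api/2.1/jobs/create.
task = {
    "task_key": "my_task",
    "notebook_task": {"notebook_path": "/Workspace/Users/me@example.com/my_notebook"},
    "new_cluster": {
        "spark_version": "15.4.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "num_workers": 2,
    },
    "libraries": [
        # Absolute workspace path; relative paths are not reliably resolved here.
        {"requirements": "/Workspace/Users/me@example.com/project/requirements.txt"}
    ],
}
```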
- 848 Views
- 1 reply
- 0 kudos
Can we share Delta table data with Salesforce using OData?
Hello! I'm seeking recommendations for streaming on-demand data from Databricks Delta tables to Salesforce. Is OData a viable choice? Thanks.
Hi @ChristopherQ1, Salesforce has released a zero-copy connection that relies on the SQL Warehouse to ingest data when needed. I suggest you consider that instead of OData. Matthew
- 131 Views
- 0 replies
- 0 kudos
Paralellizing XGBoost Hyperopt run using Databricks
Hi there! I am implementing a classifier for classifying documents into their respective healthcare types. My current setup implements the regular XGBClassifier, whose hyperparameters are to be tuned on my dataset using Hyperopt. Base...
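Since this has no replies yet: the standard way to parallelize a Hyperopt search on Databricks is SparkTrials, which farms each trial out as a Spark task. A minimal sketch on a synthetic dataset; the search space, parallelism, and metric are illustrative only:

```python
from hyperopt import fmin, tpe, hp, SparkTrials, STATUS_OK
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

# Synthetic stand-in for the document-feature matrix.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

def objective(params):
    model = XGBClassifier(
        max_depth=int(params["max_depth"]),
        learning_rate=params["learning_rate"],
        n_estimators=100,
        eval_metric="logloss",
    )
    score = cross_val_score(model, X, y, cv=3, scoring="f1").mean()
    return {"loss": -score, "status": STATUS_OK}

search_space = {
    "max_depth": hp.quniform("max_depth", 3, 10, 1),
    "learning_rate": hp.loguniform("learning_rate", -5, 0),
}

# SparkTrials runs up to `parallelism` trials concurrently across the cluster.
trials = SparkTrials(parallelism=4)
best = fmin(fn=objective, space=search_space, algo=tpe.suggest,
            max_evals=32, trials=trials)
```

Note that with SparkTrials the objective runs on worker nodes, so the training data is pickled to them; for large datasets, consider loading the data inside the objective instead.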
- 1298 Views
- 5 replies
- 1 kudos
Variables in databricks.yml "include:" - Asset Bundles
Hi, we've got an app that we deploy to multiple customers' workspaces. We're looking to transition to asset bundles. We would like to structure our resources like: -src/ -resources/ |-- customer_1/ |-- job_1 |-- job_2 |-- customer_2/ |-- job_...
I have a similar use case. We have two different Databricks hosts, EU and NA. In some cases we need to deploy a similar job to both hosts. To solve that, here is what I did: in the job folder I created different job files, each one for one host. In aditio...
- 309 Views
- 1 reply
- 0 kudos
Databricks Apps is now available in Public Preview
Databricks Apps, a new way to build and deploy internal data and AI applications, is now available in Public Preview. Databricks Apps lets developers build native apps using frameworks like Dash, Shiny, and Streamlit, enabling data applications for non...
- 308 Views
- 3 replies
- 0 kudos
Embed Dashboard - GraphQL Operation Not Authentic
I have added a domain to my list of approved domains for embedding dashboards from my Databricks instance. This domain hosts my Docusaurus site. When the page with the embedded dashboard loads, it makes some network requests to Databricks that are fa...
Is it possible that this is happening because the website is not served over HTTPS?
- 12270 Views
- 4 replies
- 3 kudos
Permissions on Unity Catalog Table Constraints
Hi all. I've used the new options to add constraints to UC tables. Even after granting permissions to a user (ALL PRIVILEGES) on a particular schema, we get errors when trying to add PKs. The message doesn't make sense (PERMISSION_DENIED: User is not an owner of T...
So how does one grant these permissions to non-owners?
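As far as I know, constraint DDL requires table ownership rather than ALL PRIVILEGES, which matches the error text. A sketch of the usual fix, transferring ownership so the other principal can add the PK (names are placeholders):

```python
# Assumption: you currently own the table (or are a metastore admin)
# and can reassign ownership to the principal that needs to add the PK.
spark.sql("""
    ALTER TABLE main.my_schema.my_table
    OWNER TO `data-engineers`
""")

# The new owning principal can then add the primary key
# (the PK column must already be declared NOT NULL).
spark.sql("""
    ALTER TABLE main.my_schema.my_table
    ADD CONSTRAINT pk_my_table PRIMARY KEY (id)
""")
```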