- 460 Views
- 1 replies
- 0 kudos
Update on CTE
So I am porting business logic from on-prem to Azure Databricks. What the on-prem process did is create the table and then update it. I have to construct that as a single query. Example: CREATE OR REPLACE TABLE table1 WITH CTE1 AS (), CTE2 AS (selec...
An actual UPDATE may not be possible, but have you considered whether something like this would work for you? It simulates updates within the query without actual UPDATE statements: CREATE OR REPLACE TABLE table1 AS WITH CTE1 AS ( -- Your in...
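The reply's pattern can be made concrete with a small runnable sketch. SQLite is used here purely so the example runs anywhere; the same `CREATE OR REPLACE TABLE ... AS WITH ...` shape works as a Databricks SQL statement. Table and column names are made up, and each CTE stands in for one of the on-prem UPDATE steps, expressed as a derived column instead of an in-place update:

```python
import sqlite3

# Chain CTEs so each one applies the next "update" on top of the previous,
# then materialize the final result in a single CREATE TABLE ... AS statement.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE src (id INTEGER, amount INTEGER)")
con.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10), (2, 20), (3, 30)])

con.execute("""
    CREATE TABLE table1 AS
    WITH cte1 AS (                 -- first "update": add 5 to every amount
        SELECT id, amount + 5 AS amount FROM src
    ),
    cte2 AS (                      -- second "update": zero out id = 2
        SELECT id,
               CASE WHEN id = 2 THEN 0 ELSE amount END AS amount
        FROM cte1
    )
    SELECT * FROM cte2
""")

rows = con.execute("SELECT id, amount FROM table1 ORDER BY id").fetchall()
print(rows)  # → [(1, 15), (2, 0), (3, 35)]
```

Each later UPDATE in the on-prem script becomes one more CTE layered on the previous one, so the whole sequence collapses into one statement.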
- 2241 Views
- 5 replies
- 4 kudos
Impersonating a user
How do I impersonate a user? I can't find any documentation that explains how to do this or even hints that it's possible. Use case: I perform administrative tasks like assigning grants and roles to catalogs, schemas, and tables for the benefit of busines...
DB-I-8117 is the idea tracking this; it is mentioned as being considered for the future, so adding votes will certainly help.
- 1262 Views
- 2 replies
- 0 kudos
databricks grants update catalog catalog_name --json @privileges.json not updating privileges
Hi Team, I am trying to update the catalog privileges using the Databricks CLI grants command with an attached JSON file, but it is not updating the privileges; please help with the grants update command usage. Command used: databricks grants update c...
Hello @Prasad_Koneru, if the command is not updating the privileges as expected, there could be a few reasons for this. First, ensure that the JSON file is correctly formatted and contains the correct privilege assignments. The privileges.json fi...
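For reference, the `--json` payload for `databricks grants update` follows the Unity Catalog permissions-change shape: a list of changes, each naming a principal and the privileges to add or remove. A sketch (the principal and privilege names here are placeholders to adapt):

```json
{
  "changes": [
    {
      "principal": "data-team@example.com",
      "add": ["USE_CATALOG", "USE_SCHEMA", "SELECT"],
      "remove": []
    }
  ]
}
```

With a file in that shape, the command form is `databricks grants update catalog <catalog_name> --json @privileges.json`; a payload that is valid JSON but missing the `changes` wrapper will typically be accepted without changing anything.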
- 3279 Views
- 3 replies
- 3 kudos
Resolved! Find and replace
Hi, is there a "Find and replace" option to edit SQL code? I am not referring to the "replace" function but something similar to Ctrl+Shift+F in Snowflake or Ctrl+F in MS Excel.
Is there an option to find and replace just within a cell instead of the entire notebook?
- 3549 Views
- 5 replies
- 1 kudos
How to get data from Splunk on daily basis?
I am looking for ways to get data into Databricks from Splunk (similar to other data sources like S3, Kafka, etc.). I have received a suggestion to use the Databricks add-on to get/put data from/to Splunk. To pull the data from Databricks to S...
@Arch_dbxlearner, please follow this post for more details: https://community.databricks.com/t5/data-engineering/does-databricks-integrate-with-splunk-what-are-some-ways-to-send/td-p/22048
- 380 Views
- 1 replies
- 0 kudos
duplicate data published in kafka offset
We have 25k records which are published in batches of 5k. We number the records based on the row_number window function and create the batches using this. We have observed that some records (around 10-20) are getting published duplicated in 2 offsets. Ca...
Hi @dipali_globant, duplicate data in Kafka can arise in a batch-processing scenario for a few reasons. Here's an example of ensuring unique and consistent row numbering: from pyspark.sql import Window from pyspark.sql.functions import row_number wind...
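The underlying issue is usually that `row_number` over a non-unique ordering is not deterministic: if Spark recomputes a stage, ties can be ordered differently, so a record near a batch boundary can receive two different numbers and be published twice. A minimal pure-Python sketch of the fix, ordering by a unique key so batch assignment is stable across runs (record shape and sizes are illustrative):

```python
# Deterministic batching: assign each record a stable position by sorting
# on a UNIQUE key, then slice into fixed-size batches. If the sort key is
# not unique, two computations can order ties differently, so a record near
# a batch boundary may land in two batches (the observed duplicates).

def make_batches(records, key, batch_size):
    """Sort records by a unique key and split into batches of batch_size."""
    ordered = sorted(records, key=key)  # deterministic given a unique key
    return [ordered[i:i + batch_size] for i in range(0, len(ordered), batch_size)]

records = [{"id": i, "payload": f"row-{i}"} for i in range(25_000)]
batches = make_batches(records, key=lambda r: r["id"], batch_size=5_000)

assert len(batches) == 5
# Every record appears exactly once across all batches.
seen = [r["id"] for b in batches for r in b]
assert len(seen) == len(set(seen)) == 25_000
```

In Spark terms the equivalent is making the window's `orderBy` include a unique tie-breaking column (or a precomputed monotonic id), so `row_number` gives the same answer on every recomputation.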
- 330 Views
- 1 replies
- 0 kudos
How do I find the total number of input tokens to Genie?
I am calculating usage analytics for my work, where they use Genie. I have given the following to my Genie as its definition: (1) instructions, (2) example SQL queries, (3) within the catalog, I went to the relevant table schemas and added comments, descriptio...
Or is there any set of tables and functions to determine the number of input and output tokens per query?
- 7079 Views
- 1 replies
- 0 kudos
How to use R in Databricks
Hello everyone. I am a new user of Databricks; they implemented it at the company where I work. I am a business analyst and I know some R, not much, and when I saw that Databricks could use R I was very excited because I thought that the...
Hello JCamiloCS, did you figure it out? We have had the same question, so I wonder if you found any good guidance.
- 922 Views
- 7 replies
- 0 kudos
Cluster logs folder
Hi, I can't seem to find the cluster_logs folder. Can anyone help me find where the cluster logs are stored? Best regards
Thank you for the help! I have enabled predictive optimization for Unity Catalog, thinking it would automatically perform VACUUM on the tables I have in my delta lake. With that in mind, I assumed VACUUM wouldn't require further attention. Would it be...
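On the original question: there is no cluster_logs folder unless log delivery has been configured for the cluster (under Advanced Options > Logging in the UI, or `cluster_log_conf` in the Clusters API). A sketch of the API field, with a placeholder destination:

```json
{
  "cluster_name": "example-cluster",
  "cluster_log_conf": {
    "dbfs": {
      "destination": "dbfs:/cluster-logs"
    }
  }
}
```

With a destination set, driver and executor logs are delivered under `<destination>/<cluster-id>/`; if the field was never set, the logs are only visible in the cluster UI and no folder exists to find.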
- 3300 Views
- 1 replies
- 1 kudos
Resolved! requirements.txt with cluster libraries
Cluster libraries are supported from version 15.0 (Databricks Runtime 15.0 | Databricks on AWS). How can I specify the requirements.txt file path in the libraries of a job cluster in my workflow? Can I use a relative path? Is it relative to the root of th...
To use the new "requirements.txt" feature in your cluster, do the following: change your cluster's "Databricks Runtime Version" to 15.0 or greater (example: "15.4 LTS ML (includes Apache Spark 3.5.0, Scala 2.12)"), then navigate to the cluster's "Libraries...
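For the workflow case specifically, job task libraries accept a `requirements` entry pointing at a requirements.txt. A hypothetical task sketch (task and cluster keys and the path are placeholders; the path is expected to be an absolute workspace or volume path rather than one relative to the bundle root):

```yaml
tasks:
  - task_key: main
    job_cluster_key: my_cluster
    libraries:
      - requirements: /Workspace/Users/someone@example.com/project/requirements.txt
```

If the file lives alongside the bundle source, one common approach is to sync it into the workspace with the rest of the bundle and reference its deployed workspace path here.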
- 1018 Views
- 1 replies
- 0 kudos
Can we share Delta table data with Salesforce using OData?
Hello! I'm seeking recommendations for streaming on-demand data from Databricks Delta tables to Salesforce. Is OData a viable choice? Thanks.
Hi @ChristopherQ1, Salesforce has released a zero-copy connector that relies on the SQL Warehouse to ingest data when needed. I suggest you consider that instead of OData. Matthew
- 211 Views
- 0 replies
- 0 kudos
Parallelizing an XGBoost Hyperopt run using Databricks
Hi there! I am implementing a classifier for classifying documents into their respective healthcare types. My current setup uses the regular XGBClassifier, whose hyperparameters are tuned on my dataset using Hyperopt. Base...
- 1931 Views
- 5 replies
- 1 kudos
Variables in databricks.yml "include:" - Asset Bundles
Hi, we've got an app that we deploy to multiple customers' workspaces. We're looking to transition to asset bundles. We would like to structure our resources like: src/ resources/ |-- customer_1/ |-- job_1 |-- job_2 |-- customer_2/ |-- job_...
I have a similar use case. We have two different Databricks hosts, EU and NA. In some cases we need to deploy a similar job to both hosts. To solve that, here is what I did: in the job folder I created different job files, one for each host. In additio...
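The approach in the reply can be sketched as a databricks.yml outline. Note that the `include:` section does not interpolate variables, so per-customer selection is typically done by listing resource folders statically and switching behavior through `targets:` (all names and hosts below are placeholders):

```yaml
bundle:
  name: my_app

include:
  # include: takes literal globs; variables are not substituted here
  - resources/customer_1/*.yml
  - resources/customer_2/*.yml

targets:
  customer_1:
    workspace:
      host: https://customer-1.cloud.databricks.com
  customer_2:
    workspace:
      host: https://customer-2.cloud.databricks.com
```

Deployment then picks the environment with `databricks bundle deploy -t customer_1`, which is how the per-host job files in the reply get routed to the right workspace.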
- 658 Views
- 1 replies
- 0 kudos
Databricks Apps is now available in Public Preview
Databricks Apps, a new way to build and deploy internal data and AI applications, is now available in Public Preview. Databricks Apps lets developers build native apps using frameworks like Dash, Shiny, and Streamlit, enabling data applications for non...
- 566 Views
- 3 replies
- 0 kudos
Embed Dashboard - GraphQL Operation Not Authentic
I have added a domain to my list of approved domains for embedding dashboards from my Databricks instance. This domain hosts my Docusaurus site. When the page with the embedded dashboard loads, it makes some network requests to Databricks that are fa...
Is it possible that this is happening because the website is not served over HTTPS?