- 2510 Views
- 3 replies
- 2 kudos
Resolved! Find and replace
Hi, is there a "Find and replace" option to edit SQL code? I am not referring to the "replace" function, but to something similar to Ctrl+Shift+F in Snowflake or Ctrl+F in MS Excel.
Is there an option to find and replace just within a cell instead of the entire notebook?
- 3266 Views
- 5 replies
- 1 kudos
How to get data from Splunk on a daily basis?
I am looking for ways to get data into Databricks from Splunk (similar to other data sources like S3, Kafka, etc.). I have received a suggestion to use the Databricks add-on to get/put the data from/to Splunk. To pull the data from Databricks to S...
@Arch_dbxlearner, could you please follow this post for more details: https://community.databricks.com/t5/data-engineering/does-databricks-integrate-with-splunk-what-are-some-ways-to-send/td-p/22048
- 294 Views
- 1 replies
- 0 kudos
Duplicate data published in Kafka offsets
We have 25k records, which are published in batches of 5k. We number the records with a row_number window function and create the batches from that numbering. We have observed that some records (around 10-20) are getting published duplicated across 2 offsets. Ca...
Hi @dipali_globant, duplicate data in Kafka can arise in a batch-processing scenario for a few reasons. Here's an example of ensuring unique and consistent row numbering: from pyspark.sql import Window from pyspark.sql.functions import row_number wind...
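The reply's snippet is cut off, but the core point is that the window's ordering key must be unique: if rows tie on the sort key (e.g. only a timestamp), their relative order can differ between runs, and the same record can land in two batches and hence two offsets. A minimal plain-Python sketch of the batch-assignment logic (the `id`/`ts` field names are hypothetical stand-ins for your columns):

```python
# Sketch: assigning records to fixed-size batches the way a
# row_number() window over a unique ordering key would.
# The unique "id" breaks ties on "ts", so numbering is stable
# across runs and no record can fall into two batches.

def assign_batches(records, batch_size):
    """Sort on (ts, id), number rows 1..n, derive batch from the row number."""
    ordered = sorted(records, key=lambda r: (r["ts"], r["id"]))
    return [
        {**r, "row_number": i + 1, "batch": i // batch_size}
        for i, r in enumerate(ordered)
    ]

records = [{"id": i, "ts": i % 3} for i in range(10)]  # many ts ties
batched = assign_batches(records, batch_size=5)

# Every record appears exactly once, in exactly one batch.
seen = [r["id"] for r in batched]
assert len(seen) == len(set(seen)) == 10
```

Without the tie-breaking `id` in the sort key, two runs of the same job could order tied rows differently, which is exactly how a handful of records end up published twice.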
- 297 Views
- 1 replies
- 0 kudos
How do I find the total number of input tokens to Genie?
I am calculating usage analytics for my work, where they use Genie. I have given the following to my Genie as its definition: (1) instructions, (2) example SQL queries, (3) within the catalog, I went to the relevant table schemas and added comments, descriptio...
Or is there any set of tables and functions to determine the number of input and output tokens per query?
- 6666 Views
- 1 replies
- 0 kudos
How to use R in Databricks
Hello everyone. I am a new user of Databricks; they implemented it in the company where I work. I am a business analyst and I know some R, though not much. When I saw that Databricks could use R I was very excited, because I thought that the...
Hello JCamiloCS, did you figure it out? We have had the same question, so I just wonder if you found any good guidance.
- 794 Views
- 7 replies
- 0 kudos
Cluster logs folder
Hi, I can't seem to find the cluster_logs folder. Can anyone help me find where the cluster logs are stored? Best regards
Thank you for the help! I have enabled predictive optimization for Unity Catalog, thinking it would automatically perform VACUUM on the tables I have in my delta lake. With that in mind, I assumed VACUUM wouldn't require further attention. Would it be...
- 2789 Views
- 1 replies
- 1 kudos
Resolved! requirements.txt with cluster libraries
Cluster libraries are supported from version 15.0 - Databricks Runtime 15.0 | Databricks on AWS. How can I specify the requirements.txt file path in the libraries of a job cluster in my workflow? Can I use a relative path? Is it relative to the root of th...
To use the new "requirements.txt" feature in your cluster, do the following: change your cluster's "Databricks Runtime Version" to 15.0 or greater (example: "15.4 LTS ML (includes Apache Spark 3.5.0, Scala 2.12)"), then navigate to the cluster's "Libraries...
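The steps above cover the cluster UI; for a job cluster defined in a workflow, the task's libraries array also accepts a `requirements` entry on DBR 15.0 and above. A hedged sketch of what that can look like in a bundle/job YAML (the job name, notebook path, and workspace path are placeholders; an absolute `/Workspace/...` path is the safer choice, since how relative paths resolve was the open question in this thread):

```yaml
# Hypothetical job definition: the `requirements` library type
# (DBR 15.0+) points the job cluster at a requirements.txt file.
resources:
  jobs:
    my_job:
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./src/main_notebook
          new_cluster:
            spark_version: "15.4.x-scala2.12"
            num_workers: 2
          libraries:
            - requirements: /Workspace/Users/someone@example.com/project/requirements.txt
```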
- 918 Views
- 1 replies
- 0 kudos
Can we share Delta table data with Salesforce using OData?
Hello! I'm seeking recommendations for streaming on-demand data from Databricks Delta tables to Salesforce. Is OData a viable choice? Thanks.
Hi @ChristopherQ1, Salesforce has released a zero-copy connection that relies on the SQL Warehouse to ingest data when needed. I suggest you consider that instead of OData. Matthew
- 171 Views
- 0 replies
- 0 kudos
Parallelizing an XGBoost Hyperopt run using Databricks
Hi there! I am implementing a classifier that assigns documents to their respective healthcare type. My current setup uses the regular XGBClassifier, whose hyperparameters are tuned on my dataset using Hyperopt. Base...
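On Databricks the usual route for this is Hyperopt's `SparkTrials`, which fans the candidate evaluations out to cluster workers. As a self-contained illustration of the underlying idea (evaluating hyperparameter candidates in parallel and keeping the best), here is a stdlib-only sketch: the executor stands in for SparkTrials, and the toy objective stands in for an XGBClassifier cross-validation score:

```python
# Sketch: random-search hyperparameter tuning with parallel trials.
# On Databricks, hyperopt's SparkTrials plays the role of the pool,
# shipping each trial to a Spark worker; the toy objective below
# stands in for fitting and scoring an XGBClassifier.
import random
from concurrent.futures import ThreadPoolExecutor

def objective(params):
    # Pretend "loss": minimized at max_depth=6, learning_rate=0.1.
    return (params["max_depth"] - 6) ** 2 + (params["learning_rate"] - 0.1) ** 2

def sample_params(rng):
    return {
        "max_depth": rng.randint(3, 10),
        "learning_rate": rng.choice([0.01, 0.05, 0.1, 0.3]),
    }

def tune(n_trials=32, n_workers=4, seed=0):
    rng = random.Random(seed)
    candidates = [sample_params(rng) for _ in range(n_trials)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        losses = list(pool.map(objective, candidates))
    # Keep the (loss, params) pair with the lowest loss.
    return min(zip(losses, candidates), key=lambda t: t[0])

best_loss, best_params = tune()
assert best_loss >= 0
```

With real XGBoost training the trials are CPU-heavy, which is why SparkTrials (process-per-worker across the cluster) is preferred over an in-driver thread pool.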
- 1602 Views
- 5 replies
- 1 kudos
Variables in databricks.yml "include:" - Asset Bundles
Hi, we've got an app that we deploy to multiple customers' workspaces. We're looking to transition to asset bundles. We would like to structure our resources like:
src/
resources/
|-- customer_1/
    |-- job_1
    |-- job_2
|-- customer_2/
    |-- job_...
I have a similar use case. We have two different hosts for Databricks, EU and NA. In some cases we need to deploy a similar job to both hosts. To handle that, here is what I did: in the job folder I created different job files, one per host. In additio...
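The per-host layout described above can be expressed with bundle targets. A hedged sketch of a databricks.yml (the bundle name, hosts, and glob are placeholders; note that `include:` paths are static and do not interpolate bundle variables, which is why per-target behaviour usually lives in the job definitions or in separate targets rather than in the include list itself):

```yaml
# Hypothetical databricks.yml: one bundle, one static include glob,
# two workspace targets selected at deploy time
# (databricks bundle deploy -t eu / -t na).
bundle:
  name: my_app

include:
  - resources/*/*.yml

targets:
  eu:
    workspace:
      host: https://adb-1111111111111111.11.azuredatabricks.net
  na:
    workspace:
      host: https://adb-2222222222222222.22.azuredatabricks.net
```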
- 493 Views
- 1 replies
- 0 kudos
Databricks Apps is now available in Public Preview
Databricks Apps, a new way to build and deploy internal data and AI applications, is now available in Public Preview. Databricks Apps lets developers build native apps using frameworks like Dash, Shiny, and Streamlit, enabling data applications for non...
- 486 Views
- 3 replies
- 0 kudos
Embed Dashboard - GraphQL Operation Not Authentic
I have added a domain to my list of approved domains for embedding dashboards from my Databricks instance. This domain hosts my Docusaurus site. When the page with the embedded dashboard loads, it makes some network requests to Databricks that are fa...
Is it possible that this is happening because the website is not served over HTTPS?
- 13274 Views
- 4 replies
- 3 kudos
Permissions on Unity Catalog Table Constraints
Hi all. I've used the new options to add constraints to UC tables. Even after granting permissions to a user (ALL PRIVILEGES) on a particular schema, we get errors when trying to add PKs. The message doesn't make sense (PERMISSION_DENIED: User is not an owner of T...
So how does one grant these permissions to non-owners?
- 883 Views
- 4 replies
- 0 kudos
Unity Catalog
I want to create a Unity Catalog metastore but can't find the "Manage Account" option in the drop-down. I am using an Azure one-month free trial subscription. Is it because of the free trial account?
If you are an Account admin, you should see it under the workspace name. Please check whether you are logged in as an Account admin.
- 506 Views
- 1 replies
- 0 kudos
Connection between Azure Databricks workspace and Microsoft Power BI on a remote server (jump box)
Both the Azure Databricks workspace and Microsoft Power BI reside on a jump-box server (remote environment). The Azure Databricks connector is not visible there; the connection is possible through a third-party tool but not directly with the built-in connector, a...
Hi @Murtaza-007-007, you can take a look at the thread below. In my company it was due to firewall rules blocking some DNS lookups required by the connector. https://community.fabric.microsoft.com/t5/Desktop/PowerBi-Desktop-App-missing-databricks-connector/td-p/3474...