- 243 Views
- 1 replies
- 0 kudos
GCP Databricks Spark Connector for Cassandra - Error: com.typesafe.config.impl.ConfigImpl.newSimple
Hello, I am using Databricks Runtime 12.2 with the Spark connector com.datastax.spark:spark-cassandra-connector_2.12:3.3.0, as Runtime 12.2 comes with Spark 3.3.2 and Scala 2.12. I encounter an issue with connecting to the Cassandra DB using the below co...
Try using the assembly version of the jar with 12.2: https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector-assembly. If this doesn't work, please paste the full, original stacktrace.
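For reference, a minimal sketch of how the connector is typically wired up once the assembly jar is attached to the cluster. The host, keyspace, and table names below are placeholders, not values from the thread:

```python
# Sketch: options for reading a Cassandra table through the Spark
# Cassandra connector (assumes spark-cassandra-connector-assembly for
# Scala 2.12 / 3.3.0 is attached to the DBR 12.2 cluster).
def cassandra_read_options(host, keyspace, table):
    """Collect the options passed to spark.read for a Cassandra table."""
    return {
        "spark.cassandra.connection.host": host,  # contact point(s)
        "keyspace": keyspace,
        "table": table,
    }

# On the cluster you would then run something like:
# df = (spark.read
#       .format("org.apache.spark.sql.cassandra")
#       .options(**cassandra_read_options("10.0.0.5", "my_ks", "my_table"))
#       .load())
```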
- 1072 Views
- 6 replies
- 0 kudos
Resolved! Is it possible to obtain a job's event log via the REST API?
Currently, to investigate job performance, I can look at a job's information (via the UI) to see the "Event Log" (pictured below). I'd like to obtain this information programmatically, so I can analyze it across jobs. However, the docs for the `get` c...
I also see there is a "list cluster events" API (https://docs.databricks.com/api/workspace/clusters/events); can I get the event log this way?
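As a sketch of what calling that endpoint might look like. The workspace URL, token, and cluster ID below are placeholders, and whether these cluster events correspond to the job UI's "Event Log" is exactly the open question in this thread:

```python
import json
import urllib.request

def cluster_events_request(workspace_url, token, cluster_id,
                           offset=0, limit=50):
    """Build a POST request for one page of /api/2.0/clusters/events."""
    body = json.dumps(
        {"cluster_id": cluster_id, "offset": offset, "limit": limit}
    ).encode("utf-8")
    return urllib.request.Request(
        f"{workspace_url}/api/2.0/clusters/events",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder values; urllib.request.urlopen(req) would return a JSON
# page of events plus a next_page token for continued paging.
req = cluster_events_request(
    "https://example.cloud.databricks.com", "dapi-TOKEN", "0123-456789-abcdef"
)
```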
- 3683 Views
- 2 replies
- 1 kudos
Resolved! How are Struct type columns stored/accessed (interested in efficiency)?
Hello, I've searched around for a while and didn't find a similar question here or elsewhere, so thought I'd ask... I'm assessing the storage/access efficiency of Struct type columns in Delta tables. I want to know more about how Databricks is storing...
Thank you very much for the thoughtful response. Please excuse my belated feedback, and thanks again!
- 423 Views
- 3 replies
- 0 kudos
Databricks Clean Rooms with 3 or more collaborators
Let's say I create a clean room with 2 other collaborators, call them collaborator A and collaborator B (so 3 in total, including me), and then share some tables to the clean room. If collaborator A writes code that does a "SELECT * FROM creator.<tab...
Hi @pardeep7, as per my understanding, all participants of a clean room can only see metadata; the raw data in your tables is not directly accessed by other collaborators. Any output tables created by collaborators based on the queries/notebooks will b...
- 313 Views
- 1 replies
- 0 kudos
Databricks System Table access to group
Can we grant USE and SELECT permissions on system tables to a group (AD group) instead of individual users?
- 484 Views
- 2 replies
- 1 kudos
Resolved! Connect databricks community edition to datalake s3/adls2
Does anybody know how I can connect to AWS S3 object storage from Databricks Community Edition? Is this possible with a Community account or not?
Hi @harsh_Dev, you can read from and write to AWS S3 with Databricks Community Edition. As you will not be able to use instance profiles, you will need to configure the AWS credentials manually and access S3 using an S3 URI. Try the below code: spark._jsc.hadoop...
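The truncated code presumably sets Hadoop configuration keys by hand. A hedged sketch of that approach, using the standard s3a connector key names; the credential values and bucket path are placeholders you must supply:

```python
# Sketch: S3 access on Community Edition without instance profiles,
# by setting the s3a Hadoop configuration keys manually.
def s3_hadoop_conf(access_key, secret_key):
    """Return the Hadoop configuration keys needed for s3a:// access."""
    return {
        "fs.s3a.access.key": access_key,
        "fs.s3a.secret.key": secret_key,
        "fs.s3a.endpoint": "s3.amazonaws.com",
    }

# On the cluster you would apply these and then read with an S3 URI:
# for k, v in s3_hadoop_conf(ACCESS_KEY, SECRET_KEY).items():
#     spark._jsc.hadoopConfiguration().set(k, v)
# df = spark.read.csv("s3a://my-bucket/path/data.csv", header=True)
```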
- 262 Views
- 1 replies
- 1 kudos
Required versus current compute setup
To run the demo and lab notebooks, I am required to have the following Databricks runtime(s): 15.4.x-cpu-ml-scala2.12, but the compute in my setup is on the following runtime version; will that be an issue? 11.3 LTS (includes Apache Spark 3.3.0, Scala 2.1...
Hello @AGnewbie, Firstly, regarding the Databricks runtime: your compute setup is currently running version 11.3 LTS, which will indeed be an issue as the specified version is not present in your current runtime. Hence, you need to update your runtim...
- 747 Views
- 1 replies
- 0 kudos
Decimal(32,6) datatype in Databricks - precision roundoff
Hello All, I need your assistance. I recently started a migration project from Synapse Analytics to Databricks. While dealing with the datatypes, I came across a situation where in Dedicated SQL Pool the value is 0.033882, but in Databricks the value ...
Hi @Boyeenas, I believe your assumption is correct. Databricks is built on Apache Spark, and the system applies rounding automatically based on the value of the subsequent digit. In your case, if the original value had a 7th decimal digit of 5 or high...
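The effect described above can be reproduced with Python's decimal module, assuming half-up rounding on the cast to DECIMAL(32,6). The exact 7th digit of the source value is not given in the thread, so the input below is illustrative only:

```python
from decimal import Decimal, ROUND_HALF_UP

# Illustrative: a hypothetical 7-digit source value quantized to the
# 6 decimal places a DECIMAL(32,6) column can hold. With half-up
# rounding, a 7th digit of 5 or higher pushes the 6th digit up.
source = Decimal("0.0338825")
stored = source.quantize(Decimal("0.000001"), rounding=ROUND_HALF_UP)
print(stored)  # 0.033883
```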
- 1141 Views
- 0 replies
- 0 kudos
How to resolve "cannot import name 'Iterable' from 'collections'" error?
I'm running a DBR/Spark job using a container. I've set docker_image.url to `docker.io/databricksruntime/standard:13.3-LTS`, as well as the Spark env var `DATABRICKS_RUNTIME_VERSION=13.3`. At runtime, however, I'm encountering this error: ImportError...
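Although the thread has no replies, the error itself is well understood: Python 3.10 removed the abstract base class aliases (such as `Iterable`) from `collections`; they now live only in `collections.abc`. A version-tolerant import looks like:

```python
# Python 3.10 removed ABCs such as Iterable from `collections`;
# import them from `collections.abc` instead. This shim works on
# both old and new interpreters:
try:
    from collections.abc import Iterable
except ImportError:  # only reached on very old Pythons (< 3.3)
    from collections import Iterable

assert isinstance([1, 2, 3], Iterable)
```

If the failing import comes from a third-party package rather than your own code, upgrading that package (or pinning a runtime whose Python predates 3.10) is usually the practical fix.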
- 238 Views
- 1 replies
- 0 kudos
Job "run name" in the system.lakeflow.job_run_timeline table
For a few jobs in Unity Catalog the "run name" is coming out as "null", whereas for a few we get the complete name with a system-generated batch ID. I am not sure how this field is populated and why for some jobs the "run name" is present whereas for some i...
Hello @mishrarit! Run name in Unity Catalog job runs is determined by how the job is triggered. For manual runs, Databricks automatically generates a name, and for scheduled or API-triggered runs, the run name remains null unless explicitly defined.
- 995 Views
- 2 replies
- 0 kudos
Set up compute policy to allow installing python libraries from a private package index
In our organization, we maintain a number of libraries we use to share code. They're hosted on a private Python package index, which requires a token to allow downloads. My idea was to store the token as a secret which would then be loaded into a cluste...
I figured it out; it seems secrets can only be loaded into environment variables if the value is the secret reference and nothing else: "value": "{{secrets/global/arneCorpPyPI_token}}" # this will work, whereas "value": "foo {{secrets/global/arneCorpPyPI_toke...
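That rule (the entire value must be a single secret reference) can be expressed as a small check. The regex below is an assumption about the `{{secrets/<scope>/<key>}}` reference syntax, not Databricks' actual validator:

```python
import re

# Assumed shape of a secret reference: the whole string must be
# exactly {{secrets/<scope>/<key>}} for substitution to happen.
SECRET_REF = re.compile(r"^\{\{secrets/[^/]+/[^}]+\}\}$")

def resolves_as_secret(value):
    """True only if the entire value is a single secret reference."""
    return bool(SECRET_REF.match(value))

assert resolves_as_secret("{{secrets/global/arneCorpPyPI_token}}")
assert not resolves_as_secret("foo {{secrets/global/arneCorpPyPI_token}}")
```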
- 455 Views
- 1 replies
- 0 kudos
Creating Unity Catalog in Personal AZURE Portal Account
Seeking advice on the following: 1. Given that I have a personal, and not an organization-based, Azure Portal account, 2. and that I can see I am Global Admin and have the Admin role in Databricks, 3. why can I not get "Manage Account" for a...
@GerardAlexander Try signing in to the Account Console (https://accounts.azuredatabricks.net/login) using a user account with the appropriate permissions, rather than accessing it from the workspace. If you are unable to sign in, the following resourc...
- 688 Views
- 0 replies
- 0 kudos
Is there a way to prevent databricks-connect from installing a global IPython Spark startup script?
I'm currently using databricks-connect through VS Code on MacOS. However, this seems to install (and re-install upon deletion) an IPython startup script which initializes a SparkSession. This is fine as far as it goes, except that this script is *glo...
- 2667 Views
- 3 replies
- 1 kudos
Power BI - Azure Databricks Connector shows Error AAD is not setup for domain
Hi Team, what I would like to understand is what is required for the Power BI gateway to use single sign-on (AAD) with Databricks. Is that something you might have encountered before and know the fix for? I currently get a message from Power BI that AAD is not ...
Hello, did you find a solution for this? I am facing the same issue.
- 857 Views
- 0 replies
- 0 kudos
Can AWS workspaces share subnets?
The docs state: "You can choose to share one subnet across multiple workspaces or both subnets across workspaces." as well as: "You can reuse existing security groups rather than create new ones." And on this page: "If you plan to share a VPC and subnets ...