- 3183 Views
- 4 replies
- 0 kudos
Cluster access mode set to Shared on Databricks results in connection refused on Exasol
I am trying to run a TRUNCATE command on my Exasol DWH from Databricks using pyexasol. This works perfectly fine when I have the cluster access mode set to "No Isolation Shared", which does not have access to our Unity Catalog. When I change the clust...
Interesting. Did you try with "Single User" mode, which also has UC support?
- 1842 Views
- 2 replies
- 0 kudos
Save output of show table extended to table?
I want to save the output of `show table extended in catalogName like 'mysearchtext*';` to a table. How do I do that?
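One hedged sketch of an answer: `spark.sql()` returns the result of a `SHOW` statement as a regular DataFrame, which can then be written out with `saveAsTable()`. The catalog, schema, and target table names below are illustrative, not from the post.

```python
# Sketch only: names are illustrative, and the commented lines assume an
# active SparkSession on a Databricks cluster.

def show_tables_query(catalog: str, pattern: str) -> str:
    """Build the SHOW TABLE EXTENDED statement described in the post."""
    return f"SHOW TABLE EXTENDED IN {catalog} LIKE '{pattern}'"

# On a cluster, the result of spark.sql() is a plain DataFrame, so it can
# be persisted like any other:
# df = spark.sql(show_tables_query("my_catalog.my_schema", "mysearchtext*"))
# df.write.mode("overwrite").saveAsTable("my_catalog.my_schema.table_info")
```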
- 30204 Views
- 3 replies
- 7 kudos
Introducing the Data Intelligence Platform
Introducing the Data Intelligence Platform, our latest AI-driven data platform constructed on a lakehouse architecture. It’s not just an incremental improvement over current data platforms, but a fundamental shift in product strategy and roadmap. E...
Hmm, I preferred the water-related naming, like data lake, Delta Lake, and lakehouse.
- 2156 Views
- 1 replies
- 0 kudos
java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManager.<init>(Lcom/amazonaw
- 1272 Views
- 0 replies
- 0 kudos
MissingCredentialScopeException when writing hail matrix table to unity volumes
Hello, I am tinkering with using Unity Catalog Volumes with hail. I tried the following: `hl_1000g.write(str(VOLUMES_PATH / '1kg.mt'), overwrite=True)`, where `VOLUMES_PATH` is `Path("/Volumes") / "hail" / "volumes_testing" / "1kg"`. Unfortunately ...
- 3303 Views
- 1 replies
- 0 kudos
Azure Databricks Notebook Sharing, Notebook Exporting and Notebook Clipboard copy download
Hello, I would like to know in which scenario an Azure Databricks user would be able to download notebook command output if Notebook Result Download is disabled. Do we know if a privileged user would be able to share sensitive information with non-privilege...
Thank you Kaniz. Can we disable exporting of notebooks in any format except Source File? If yes, then how do we achieve it? Also, we do not want to share any notebook that has notebook results; can we use spark.databricks.query.displayMaxRows and se...
- 2114 Views
- 1 replies
- 0 kudos
Resolved! 'Unity Catalog Volumes is not enabled on this instance' error
Hi all, tl;dr I ran the following on a docker-backed personal compute instance (running 13.3-LTS):

```sql
%sql
USE CATALOG hail;
USE SCHEMA volumes_testing;
CREATE VOLUME 1kg COMMENT 'Testing 1000 Genomes volume';
```

But this gives `ParseException: [UC_VOLU...`
Resolved with the setting "spark.databricks.unityCatalog.volumes.enabled" = "true"
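For readers hitting the same error: the property named in the accepted answer can also be set at session level (a sketch; setting it in the cluster's Spark config is the more usual place, and only the property name comes from the reply above).

```python
# Session-level equivalent of the cluster Spark conf from the accepted answer.
# Requires an active SparkSession on the custom/Docker runtime in question.
spark.conf.set("spark.databricks.unityCatalog.volumes.enabled", "true")
```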
- 1660 Views
- 1 replies
- 0 kudos
Databricks learning festival, but my trial is over
I just emailed the onboarding-help email account to ask for an extension for 2 weeks as I want to complete the Data Engineer course to prepare for my new position. I have 2 accounts where the trial expired, one community account which cannot be used ...
is what happened when trying to sign up with another email.
- 1759 Views
- 1 replies
- 1 kudos
More than expected number of Jobs created in Databricks
Hi Databricks gurus! I am trying to run a very simple snippet:
data_emp = [["1","sarvan","1"], ["2","John","2"], ["3","Jose","1"]]
emp_columns = ["EmpId","Name","Dept"]
df = spark.createDataFrame(data=data_emp, schema=emp_columns)
df.show()
Based on a g...
- 2957 Views
- 1 replies
- 0 kudos
Resolved! Convert multiple string fields to int or long during streaming
Source data looks like: { "IntegrityLevel": "16384", "ParentProcessId": "10972929104936", "SourceProcessId": "10972929104936", "SHA256Hash": "a26a1ffb81a61281ffa55cb7778cc3fb0ff981704de49f75f51f18b283fba7a2", "ImageFileName": "\\Device\\Harddisk...
Thanks for confirming that the readStream.withColumn() approach is the best available option. Unfortunately, this will force me to maintain a separate notebook for each of the event types, but it does work. I was hoping to create just one paramet...
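Following up on the wish for a single parameterized notebook: one hedged sketch is to drive the casts from a per-event-type mapping rather than hard-coded `withColumn()` chains. The column names below come from the sample payload in the post; the helper just builds SQL cast expressions, so the same code can serve any event type given its mapping.

```python
def cast_exprs(columns, type_map):
    """Build selectExpr() strings, casting only the columns in type_map."""
    return [
        f"CAST({c} AS {type_map[c]}) AS {c}" if c in type_map else c
        for c in columns
    ]

# Per-event-type mapping (illustrative, based on the sample payload above):
numeric_fields = {"IntegrityLevel": "INT",
                  "ParentProcessId": "LONG",
                  "SourceProcessId": "LONG"}

# On a cluster, the same helper works for batch and streaming DataFrames:
# df = spark.readStream.table("raw_events")
# typed = df.selectExpr(*cast_exprs(df.columns, numeric_fields))
```

The mapping itself could then be the notebook's parameter (e.g. via a widget), which is what would let one notebook cover all event types.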
- 2582 Views
- 2 replies
- 1 kudos
Bug report: the delimiter option does not work when run on DLT
I have a semicolon-separated file in an ADLS container that's been added to Unity Catalog as an external location. When I run the following code on an all-purpose cluster, it runs OK and displays the schema:
import dlt
@dlt.table
def test_data_csv(): ...
@Retired_mod can you confirm that .option("delimiter", ";") is ignored when run in a DLT pipeline? (Please see the post above.) My colleague confirmed the behavior.
- 4363 Views
- 1 replies
- 1 kudos
Get exceptionTraceId details
I'm getting the following error: `module.consumer_stage_catalog.databricks_external_location.catalog: Creating... Error: cannot create external location: AWS IAM role does not have READ permissions on url s3://[bucket name]/catalogs. Please conta...`
- 3364 Views
- 1 replies
- 0 kudos
Resolved! Can we pass parameters thru SQL UDF's?
Is it possible to pass a parameter to a SQL UDF to another SQL UDF that is called by the first SQL UDF?Below is an example where I would like to call tbl_filter() from tbl_func() by passing the tbl_func.a_val parameter to tbl_filter(). Obviously, I c...
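For anyone landing on this thread: a table-valued SQL UDF's parameter can be referenced in a call to another function inside its body. A minimal hedged sketch follows; the function and table names (`tbl_filter`, `tbl_func`, `my_table`) are made up for illustration, not the poster's actual code.

```python
# Sketch: one SQL UDF forwarding its parameter to another.
# All names are illustrative.
ddl_inner = """
CREATE OR REPLACE FUNCTION tbl_filter(a_val INT)
RETURNS TABLE (id INT)
RETURN SELECT id FROM my_table WHERE id > a_val
"""

ddl_outer = """
CREATE OR REPLACE FUNCTION tbl_func(a_val INT)
RETURNS TABLE (id INT)
RETURN SELECT * FROM tbl_filter(a_val)
"""

# On a cluster:
# spark.sql(ddl_inner)
# spark.sql(ddl_outer)
# spark.sql("SELECT * FROM tbl_func(5)").show()
```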
- 1833 Views
- 0 replies
- 0 kudos
company profile
At Inspired Elements, we redefine living spaces in London, offering bespoke fitted wardrobes and fitted kitchens that seamlessly blend functionality with exquisite design. Our commitment to innovation and quality ensures every piece is a work of art,...
- 4715 Views
- 1 replies
- 1 kudos
Resolved! Add columns to a delta table
Hello. Do you know if you can add columns at a specific position (before / after a column) by altering a delta table ?
Yes, using the FIRST or AFTER parameter: https://docs.databricks.com/en/sql/language-manual/sql-ref-syntax-ddl-alter-table-manage-column.html#add-column
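To make the accepted answer concrete, a hedged sketch of the syntax from the linked page (table and column names are illustrative):

```python
# Positioned column adds on a Delta table, per the linked ALTER TABLE docs.
# Table and column names are illustrative.
stmt_first = "ALTER TABLE my_table ADD COLUMN new_col STRING FIRST"
stmt_after = "ALTER TABLE my_table ADD COLUMN new_col STRING AFTER existing_col"

# On a cluster:
# spark.sql(stmt_after)
```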