- 1903 Views
- 0 replies
- 0 kudos
Hi Team, Unity Catalog is not enabled in our workspace. We would like to know the billing usage information per user. Could you please help us with how to get these details (using a notebook-level script)? Regards, Phanindra
- 10469 Views
- 1 replies
- 0 kudos
Hi all, tl;dr I ran the following on a Docker-backed personal compute instance (running 13.3 LTS):

```sql
USE CATALOG hail;
USE SCHEMA volumes_testing;
CREATE VOLUME 1kg COMMENT 'Testing 1000 Genomes volume';
```

But this gives `ParseException: [UC_VOLU...
Latest Reply
Resolved with the setting "spark.databricks.unityCatalog.volumes.enabled" = "true"
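For reference, the fix above corresponds to one line of cluster Spark configuration (the key is taken verbatim from the reply; placing it in the cluster's Spark config, rather than setting it per notebook, is an assumption):

```
spark.databricks.unityCatalog.volumes.enabled true
```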
by
nyck33
• New Contributor II
- 3148 Views
- 1 replies
- 0 kudos
I just emailed the onboarding-help email account to ask for an extension for 2 weeks as I want to complete the Data Engineer course to prepare for my new position. I have 2 accounts where the trial expired, one community account which cannot be used ...
Latest Reply
is what happened when trying to sign up with another email.
- 5398 Views
- 1 replies
- 0 kudos
Source data looks like:

```json
{
  "IntegrityLevel": "16384",
  "ParentProcessId": "10972929104936",
  "SourceProcessId": "10972929104936",
  "SHA256Hash": "a26a1ffb81a61281ffa55cb7778cc3fb0ff981704de49f75f51f18b283fba7a2",
  "ImageFileName": "\\Device\\Harddisk...
```
Latest Reply
Thanks for confirming that the readStream.withColumn() approach is the best available option. Unfortunately, this will force me to maintain a separate notebook for each of the event types, but it does work. I was hoping to create just one paramet...
- 5679 Views
- 1 replies
- 0 kudos
```python
from pyspark.sql import SparkSession
from pyspark import SparkContext, SparkConf
from pyspark.storagelevel import StorageLevel

spark = (
    SparkSession.builder.appName('TEST')
    .config('spark.ui.port', '4098')
    .enableHiveSupport()
    .getOrCreate()
)
df4 = spark.sql('...
```
Latest Reply
Thank you so much for taking time and explaining the concepts
- 4362 Views
- 2 replies
- 2 kudos
I have a semicolon-separated file in an ADLS container that's been added to Unity Catalog as an external location. When I run the following code on an all-purpose cluster, it runs OK and displays the schema.

```python
import dlt

@dlt.table
def test_data_csv():
    ...
```
Latest Reply
@Retired_mod can you confirm that .option("delimiter", ";") is ignored when run in a DLT pipeline? (please see the post above) My colleague confirmed the behavior.
1 More Replies
- 1163 Views
- 0 replies
- 0 kudos
For my exam I have to do a small project for the company I'm interning at. I am creating a data warehouse where I will have to transfer data from another database and then transform it to a star schema. Would Databricks be good for this, or is it t...
- 6279 Views
- 1 replies
- 2 kudos
I'm getting the following error:

```
module.consumer_stage_catalog.databricks_external_location.catalog: Creating...
╷
│ Error: cannot create external location: AWS IAM role does not have READ permissions on url s3://[bucket name]/catalogs. Please conta...
```
- 5636 Views
- 1 replies
- 0 kudos
Is it possible to pass a parameter from one SQL UDF to another SQL UDF that is called by the first? Below is an example where I would like to call tbl_filter() from tbl_func(), passing the tbl_func.a_val parameter to tbl_filter(). Obviously, I c...
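A minimal sketch of what such nested SQL UDF calls can look like on Databricks (the function names tbl_filter and tbl_func come from the question; the table and column names are hypothetical, and whether parameter forwarding into a table UDF is accepted can depend on the Databricks Runtime version):

```sql
-- Hypothetical table UDF that filters a table by a threshold
CREATE FUNCTION tbl_filter(threshold INT)
  RETURNS TABLE (id INT, val INT)
  RETURN SELECT id, val FROM some_table WHERE val > threshold;

-- A second table UDF forwarding its own parameter into the first
CREATE FUNCTION tbl_func(a_val INT)
  RETURNS TABLE (id INT, val INT)
  RETURN SELECT * FROM tbl_filter(a_val);

SELECT * FROM tbl_func(10);
```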
- 2609 Views
- 0 replies
- 0 kudos
At Inspired Elements, we redefine living spaces in London, offering bespoke fitted wardrobes and fitted kitchens that seamlessly blend functionality with exquisite design. Our commitment to innovation and quality ensures every piece is a work of art,...
by
elgeo
• Valued Contributor II
- 8432 Views
- 1 replies
- 1 kudos
Hello. Do you know if you can add columns at a specific position (before/after a column) by altering a Delta table?
Latest Reply
Yes, using the FIRST or AFTER parameter: https://docs.databricks.com/en/sql/language-manual/sql-ref-syntax-ddl-alter-table-manage-column.html#add-column
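The reply above can be sketched as follows (table and column names are hypothetical; the syntax follows the linked ALTER TABLE reference):

```sql
-- Add a column at the front of a Delta table
ALTER TABLE my_delta_table ADD COLUMN new_first_col STRING FIRST;

-- Add a column right after an existing column
ALTER TABLE my_delta_table ADD COLUMN new_col STRING AFTER existing_col;
```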
by
memo
• New Contributor II
- 9960 Views
- 1 replies
- 0 kudos
I want to pass multiple columns as arguments to pivot a DataFrame in PySpark, like mydf.groupBy("id").pivot("day","city").agg(F.sum("price").alias("price"), F.sum("units").alias("units")).show(). One way I found is to create multiple df with differ...
- 9275 Views
- 4 replies
- 4 kudos
We are currently upgrading our Lakehouse to use the Unity Catalog benefits. We will mostly use external tables because all our DELTA tables are already stored in Azure Storage. I am trying to figure out how to update the table property "delta.lastUpdateve...
Latest Reply
I am in the same boat. That is the reason I opted to use managed tables instead. OK, it means migrating tables and changing notebooks, but besides not having to struggle with external tables, you also get something in return (e.g., liquid clustering).
3 More Replies