Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Constantine (Contributor III) • 6797 Views • 2 replies • 4 kudos

Resolved! How does merge schema work?

Let's say I create a table like CREATE TABLE IF NOT EXISTS new_db.data_table ( key STRING, value STRING, last_updated_time TIMESTAMP ) USING DELTA LOCATION 's3://......'; Now when I insert into this table, I insert data which has, say, 20 columns a...

Latest Reply: timdriscoll22 (New Contributor II) • 4 kudos

I tried running "REFRESH TABLE tablename;" but I still do not see the added columns in the Data Explorer columns, while I do see the added columns in the sample data.

1 More Replies
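For readers landing on this thread, here is a minimal PySpark sketch of how Delta schema merging behaves on append; the DataFrame df, the S3 path, and the table name are hypothetical stand-ins for the setup described in the question.

    # Hypothetical: df carries extra columns that the Delta table does not yet have.
    # Without mergeSchema the append fails with a schema mismatch; with it, the new
    # columns are added to the table schema and existing rows read them as NULL.
    (df.write
       .format("delta")
       .option("mergeSchema", "true")
       .mode("append")
       .save("s3://example-bucket/path/to/data_table"))

    # Refresh cached metadata so catalog views pick up the newly added columns.
    spark.sql("REFRESH TABLE new_db.data_table")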
data_explorer (New Contributor II) • 1137 Views • 1 reply • 0 kudos

Is there any way to execute GRANT and REVOKE statements for a user on an object based on a condition?

SELECT if((select count(*) from information_schema.table_privileges where grantee = 'samo@test.com' and table_schema='demo_schema' and table_catalog='demo_catalog')==1, (select count(*) from demo_catalog.demo_schema.demo_table), (select count(*) from...

Latest Reply: Debayan (Databricks Employee) • 0 kudos

Hi, GRANT and REVOKE grant or revoke privileges on a securable object to or from a principal. A principal is a user, service principal, or group known to the metastore. Principals can be granted privileges and may own securable objects. Also, you can use REVOKE ON S...

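Since GRANT and REVOKE statements are not conditional on their own, one hedged approach is to put the condition in a notebook and issue the statement from Python. A minimal sketch, assuming Unity Catalog and reusing the hypothetical catalog, schema, table, and user names from the question:

    # Check information_schema for an existing SELECT privilege, then issue the
    # GRANT (or a REVOKE) from Python based on the result.
    grantee = "samo@test.com"

    has_select = spark.sql("""
        SELECT count(*) AS n
        FROM demo_catalog.information_schema.table_privileges
        WHERE grantee = 'samo@test.com'
          AND table_schema = 'demo_schema'
          AND table_name = 'demo_table'
          AND privilege_type = 'SELECT'
    """).first().n > 0

    if not has_select:
        spark.sql(f"GRANT SELECT ON TABLE demo_catalog.demo_schema.demo_table TO `{grantee}`")
    else:
        print("Grantee already has SELECT; nothing to do.")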
Ovi (New Contributor III) • 2989 Views • 4 replies • 9 kudos

Construct Dataframe or RDD from S3 bucket with Delta tables

Hi all! I have an S3 bucket with Delta parquet files/folders with different schemas each. I need to create an RDD or DataFrame from all those Delta Tables that should contain the path, name and different schema of each. How could I do that? Thank you! P...

Latest Reply: Hubert-Dudek (Esteemed Contributor III) • 9 kudos

You can mount the S3 bucket or read directly from it.
access_key = dbutils.secrets.get(scope = "aws", key = "aws-access-key")
secret_key = dbutils.secrets.get(scope = "aws", key = "aws-secret-key")
sc._jsc.hadoopConfiguration().set("fs.s3a.access.key", ac...

3 More Replies
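For anyone with the same need, a minimal sketch of collecting path, name, and schema for each Delta table under a prefix; it assumes the bucket is already accessible (mounted or via credentials, as in the reply above) and that each immediate subfolder under the hypothetical prefix is one Delta table.

    base = "s3://example-bucket/delta/"

    rows = []
    for entry in dbutils.fs.ls(base):
        # dbutils.fs.ls lists directories with a trailing slash in their name.
        if entry.name.endswith("/"):
            table_df = spark.read.format("delta").load(entry.path)
            rows.append((entry.path, entry.name.rstrip("/"), table_df.schema.simpleString()))

    # One row per table: path, folder name, and the schema rendered as a string.
    tables_df = spark.createDataFrame(rows, ["path", "name", "schema"])
    display(tables_df)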
hari (Contributor) • 2455 Views • 2 replies • 5 kudos

Resolved! Best way to automatically update a Delta table schema

We have multiple environments where the same tables are added, so it's really hard to manually update the schema of the table across all the environments. We know that it's not ideal to update a table schema a lot, but our product is still evolving and s...

Latest Reply: hari (Contributor) • 5 kudos

Thanks for the reply @Pat Sienkiewicz.

1 More Replies
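For context on the accepted approach, a minimal sketch of letting Delta evolve the schema automatically during a MERGE, so the same code can run unchanged in every environment; the target table name and the updates_df source DataFrame are hypothetical.

    from delta.tables import DeltaTable

    # Allow columns that exist only in the source to be added to the target
    # table's schema during the MERGE.
    spark.conf.set("spark.databricks.delta.schema.autoMerge.enabled", "true")

    target = DeltaTable.forName(spark, "dev.data_table")

    (target.alias("t")
        .merge(updates_df.alias("s"), "t.key = s.key")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())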
Panna (New Contributor II) • 1854 Views • 1 reply • 3 kudos

Is there only one element type option for an array?

I'm creating an array which contains both string and double values; just wondering if I can have multiple element type options for one array column? Thanks

Latest Reply: Debayan (Databricks Employee) • 3 kudos

Elements of any type that share a least common type can be used; see https://docs.databricks.com/sql/language-manual/functions/array.html#arguments. Please correct me if I misunderstood the requirement.

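To make the least-common-type rule concrete, a small sketch that can be run from a notebook; the literal values are arbitrary.

    # All array elements must share a least common type. INT and DOUBLE widen
    # to DOUBLE, so this column has type array<double>.
    spark.sql("SELECT array(1, 2.5D) AS a").printSchema()

    # To keep both string and double values in one array column, cast the
    # numeric side explicitly so every element is a STRING (array<string>).
    spark.sql("SELECT array('abc', CAST(1.5 AS STRING)) AS a").printSchema()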