Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Constantine
by Contributor III
  • 5857 Views
  • 2 replies
  • 4 kudos

Resolved! How does merge schema work

Let's say I create a table like CREATE TABLE IF NOT EXISTS new_db.data_table ( key STRING, value STRING, last_updated_time TIMESTAMP ) USING DELTA LOCATION 's3://......'; Now when I insert into this table, I insert data which has, say, 20 columns a...

Latest Reply
timdriscoll22
New Contributor II
  • 4 kudos

I tried running "REFRESH TABLE tablename;" but I still do not see the added columns in the Data Explorer column list, while I do see the added columns in the sample data.

1 More Replies
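For context on the thread above, here is a minimal sketch of how Delta schema merging behaves when incoming data has more columns than the table, assuming a Databricks notebook with an active Spark session; the sketch creates a managed table because the post's S3 location is elided, and the extra column name is made up for illustration.

# Base table with the three columns from the post (managed here; the original
# used an S3 LOCATION that is omitted in the preview).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
spark.sql("""
    CREATE TABLE IF NOT EXISTS new_db.data_table (
        key STRING,
        value STRING,
        last_updated_time TIMESTAMP
    ) USING DELTA
""")

# A DataFrame with an extra, hypothetical column; writing with mergeSchema=true
# adds the new column to the Delta table's schema instead of failing the append.
extra_df = spark.createDataFrame(
    [("k1", "v1", None, "extra")],
    "key STRING, value STRING, last_updated_time TIMESTAMP, new_col STRING",
)
(extra_df.write
    .format("delta")
    .mode("append")
    .option("mergeSchema", "true")
    .saveAsTable("new_db.data_table"))

After such an append, DESCRIBE TABLE new_db.data_table should list new_col in the notebook, even if a catalog UI takes longer to reflect the change.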
data_explorer
by New Contributor II
  • 932 Views
  • 1 reply
  • 0 kudos

Is there any way to execute GRANT and REVOKE statements for a user on an object based on a condition?

SELECT if((select count(*) from information_schema.table_privileges where grantee = 'samo@test.com' and table_schema='demo_schema' and table_catalog='demo_catalog')==1, (select count(*) from demo_catalog.demo_schema.demo_table), (select count(*) from...

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, GRANT and REVOKE apply privileges on a securable object to a principal. A principal is a user, service principal, or group known to the metastore. Principals can be granted privileges and may own securable objects. Also, you can use REVOKE ON S...

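Since plain SQL cannot branch into DDL, one option suggested by the thread is to evaluate the condition in Python and then issue the statement. The sketch below assumes Unity Catalog and reuses the catalog, schema, table, and grantee names from the post purely as placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Check whether the user already holds a privilege on the table, using the
# catalog's information_schema (names below come from the post, not a real setup).
already_granted = spark.sql("""
    SELECT count(*) AS n
    FROM demo_catalog.information_schema.table_privileges
    WHERE grantee = 'samo@test.com'
      AND table_schema = 'demo_schema'
      AND table_name = 'demo_table'
""").first()["n"] > 0

# Branch in Python and run the matching GRANT or REVOKE statement.
if already_granted:
    spark.sql("REVOKE SELECT ON TABLE demo_catalog.demo_schema.demo_table "
              "FROM `samo@test.com`")
else:
    spark.sql("GRANT SELECT ON TABLE demo_catalog.demo_schema.demo_table "
              "TO `samo@test.com`")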
Ovi
by New Contributor III
  • 2459 Views
  • 5 replies
  • 10 kudos

Construct Dataframe or RDD from S3 bucket with Delta tables

Hi all! I have an S3 bucket with Delta parquet files/folders, each with a different schema. I need to create an RDD or DataFrame from all those Delta tables that should contain the path, name, and schema of each. How could I do that? Thank you! P...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 10 kudos

You can mount the S3 bucket or read directly from it. access_key = dbutils.secrets.get(scope = "aws", key = "aws-access-key") secret_key = dbutils.secrets.get(scope = "aws", key = "aws-secret-key") sc._jsc.hadoopConfiguration().set("fs.s3a.access.key", ac...

4 More Replies
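To address the original ask in this thread (path, name, and schema of every Delta table under a prefix) rather than only bucket access, a sketch along these lines may help. It assumes the bucket is already reachable via the credentials or mount from the reply, runs in a Databricks notebook where dbutils is available, and the prefix plus the one-folder-per-table layout are assumptions.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
base_path = "s3a://my-bucket/delta/"  # hypothetical prefix; replace with the real bucket path

rows = []
for entry in dbutils.fs.ls(base_path):        # dbutils is provided by the Databricks runtime
    if not entry.name.endswith("/"):          # skip loose files; directory names end with "/"
        continue
    # Read each folder as a Delta table and capture its schema as JSON.
    schema_json = spark.read.format("delta").load(entry.path).schema.json()
    rows.append((entry.path, entry.name.rstrip("/"), schema_json))

tables_df = spark.createDataFrame(rows, "path STRING, name STRING, schema STRING")
tables_df.show(truncate=False)

Storing each schema in its JSON form keeps the result in a single DataFrame even though every table's columns differ.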
hari
by Contributor
  • 2001 Views
  • 2 replies
  • 5 kudos

Resolved! Best way to automatically update a delta table schema

We have multiple environments where the same tables are added, so it's really hard to manually update the schema of the table across all the environments. We know that it's not ideal to update a table's schema often, but our product is still evolving and s...

Latest Reply
hari
Contributor
  • 5 kudos

Thanks for the reply @Pat Sienkiewicz​ .

1 More Replies
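For readers landing on this thread, one commonly used route to avoid hand-written ALTER TABLE statements per environment is Delta's automatic schema evolution during MERGE. A hedged sketch follows, with made-up table names; the session config is the only moving part.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Allow MERGE INTO to add source columns that the target table does not have yet.
spark.conf.set("spark.databricks.delta.schema.autoMerge.enabled", "true")

# Illustrative upsert: new columns in staging_db.events_updates are added to
# prod_db.events automatically, so the same job runs unchanged in every environment.
spark.sql("""
    MERGE INTO prod_db.events AS t
    USING staging_db.events_updates AS s
      ON t.event_id = s.event_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")

Because the config is set inside the job itself, schema changes promote consistently across environments without per-environment manual steps.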
Panna
by New Contributor II
  • 1637 Views
  • 2 replies
  • 3 kudos

Is there only one element type option for an array?

I'm creating an array that contains both string and double values; I'm just wondering if I can have multiple element type options for one array column? Thanks

Latest Reply
Kaniz_Fatma
Community Manager
  • 3 kudos

Hi @Panna Pan, we haven't heard from you on the last response from @Debayan Mukherjee, and I was checking back to see if his suggestions helped you. Otherwise, if you have found a solution, please share it with the community, as it can be helpful to...

1 More Replies
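On the array question above, a Spark array column has exactly one element type. The small sketch below shows the two usual workarounds (cast everything to a common type, or give every element the same struct shape), with column names invented for the example.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([("a", 1.5)], "label STRING, score DOUBLE")

# Option 1: cast the double to string so the array has a single element type.
as_strings = df.select(
    F.array(F.col("label"), F.col("score").cast("string")).alias("values")
)

# Option 2: keep both types by wrapping each element in an identical struct
# (one string field, one double field), so the array element type is uniform.
as_structs = df.select(
    F.array(
        F.struct(F.col("label").alias("s"), F.lit(None).cast("double").alias("d")),
        F.struct(F.lit(None).cast("string").alias("s"), F.col("score").alias("d")),
    ).alias("values")
)

as_strings.show(truncate=False)
as_structs.show(truncate=False)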