- 33349 Views
- 4 replies
- 4 kudos
I want to write a SQL query that queries the information_schema to generate a list of objects, their columns, relationships etc. - basically a data dictionary. For each object I want to show the DDL code, and I know I can get it by executing show c...
Latest Reply
Hi @Richard Architect, I'm sorry you could not find a solution to your problem in the answers provided. Our community strives to provide helpful and accurate information, but sometimes an immediate solution may only be available for some issues. I sugg...
3 More Replies
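For reference, a minimal sketch of the kind of query being asked about, assuming Unity Catalog's information_schema is available and using SHOW CREATE TABLE to fetch the DDL; "my_catalog" and "my_schema" are placeholder names, not anything from the thread:

# Sketch only: list columns from information_schema, then grab each table's DDL.
# Catalog/schema names below are placeholders -- adjust to your workspace.
cols = spark.sql("""
    SELECT table_name, column_name, data_type
    FROM system.information_schema.columns
    WHERE table_catalog = 'my_catalog' AND table_schema = 'my_schema'
""")

ddl = {}
for row in cols.select("table_name").distinct().collect():
    t = row["table_name"]
    # SHOW CREATE TABLE returns a single row whose first column is the full DDL text
    ddl[t] = spark.sql(f"SHOW CREATE TABLE my_catalog.my_schema.{t}").collect()[0][0]

Joining the column listing with the collected DDL strings then gives a basic data dictionary that can be written back out as a table or exported.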
- 9440 Views
- 11 replies
- 10 kudos
Hi everyone, I am getting the following error when running a SQL query and do not understand what it means or what can be done to resolve it. Any recommendations?
View DDL:
CREATE VIEW myschema.table (
accountId,
agreementType,
capture_file_name,
...
Latest Reply
Hi @Michael Okulik, hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Tha...
10 More Replies
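Since the error text is cut off above, the following is only a generic sanity check rather than a fix for this specific thread: a common cause of view failures is the view's explicit column list drifting out of sync with its SELECT or with the underlying table, and recreating the view makes the two match again. The source table name is a placeholder; only the column names come from the snippet above:

# Sketch only: recreate the view so the column list matches the SELECT.
# myschema.some_source_table is a placeholder -- the real query is not shown in the thread.
spark.sql("""
    CREATE OR REPLACE VIEW myschema.`table` (
        accountId,
        agreementType,
        capture_file_name
    ) AS
    SELECT accountId, agreementType, capture_file_name
    FROM myschema.some_source_table
""")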
by Raie • New Contributor III
- 9049 Views
- 3 replies
- 4 kudos
What I am doing:
spark_df = spark.createDataFrame(dfnew)
spark_df.write.saveAsTable("default.test_table", index=False, header=True)
This automatically detects the datatypes and is working right now. BUT, what if the datatype cannot be detected or detect...
Latest Reply
Just create the table earlier and set the column types (CREATE TABLE ... LOCATION (path path)). In the DataFrame you need to have the corresponding data types, which you can produce using cast syntax; your syntax is just incorrect. Here is an example of the correct syntax: from p...
2 More Replies
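Because the example in the reply above is cut off at "from p...", here is a hedged sketch of the two options it alludes to: supplying an explicit schema instead of relying on inference, or casting columns before saveAsTable. dfnew is the pandas DataFrame from the question; the column names and types below are made up for illustration:

# Sketch only -- column names/types are placeholders.
from pyspark.sql.functions import col
from pyspark.sql.types import StructType, StructField, StringType, IntegerType

# Option 1: define the schema explicitly instead of letting Spark infer it
schema = StructType([
    StructField("id", IntegerType(), True),
    StructField("name", StringType(), True),
])
spark_df = spark.createDataFrame(dfnew, schema=schema)

# Option 2: cast after the fact, then save
# (note: saveAsTable does not take index/header arguments -- those belong to pandas)
spark_df = spark_df.withColumn("id", col("id").cast("int"))
spark_df.write.mode("overwrite").saveAsTable("default.test_table")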
by SajiD • New Contributor
- 1317 Views
- 0 replies
- 0 kudos
Hi everyone, I am working with Databricks notebooks and facing an issue with the Snowflake connector: I want to run DDL/DML statements through it. Can someone please help me out with this? Thanks in advance!
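Since this thread has no replies, here is a hedged sketch of one common approach: executing DDL/DML from a notebook cell with the Snowflake Python connector (snowflake-connector-python), which simply runs SQL strings against Snowflake. All credentials and object names below are placeholders:

# Sketch only -- requires snowflake-connector-python installed on the cluster.
# Every identifier/credential here is a placeholder.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="my_wh",
    database="my_db",
    schema="my_schema",
)
try:
    cur = conn.cursor()
    # DDL and DML are executed as plain SQL strings
    cur.execute("CREATE TABLE IF NOT EXISTS demo_ddl (id INT, name STRING)")
    cur.execute("INSERT INTO demo_ddl VALUES (1, 'test')")
finally:
    conn.close()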
- 5829 Views
- 1 replies
- 0 kudos
What does it mean that Delta Lake supports multi-cluster writes? Please explain. Can we write to the same Delta table from multiple clusters?
Latest Reply
It means that Delta Lake does locking to make sure that queries writing to a table from multiple clusters at the same time won’t corrupt the table. However, it does not mean that if there is a write conflict (for example, update and delete the same t...
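As a hedged illustration of the reply above: non-conflicting writes (such as appends) from different clusters succeed, but a genuine write conflict surfaces as a concurrent-modification error that the application is expected to retry. A rough sketch, assuming the delta-spark Python package is available and using a placeholder table name and retry policy:

# Sketch only: retry an append that may conflict with writers on other clusters.
import time
from delta.exceptions import ConcurrentModificationException

def append_with_retry(df, table_name, attempts=3):
    for i in range(attempts):
        try:
            df.write.format("delta").mode("append").saveAsTable(table_name)
            return
        except ConcurrentModificationException:
            # Another cluster committed a conflicting change; back off and retry
            time.sleep(2 ** i)
    raise RuntimeError(f"Giving up after {attempts} conflicting attempts on {table_name}")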
- 1103 Views
- 1 replies
- 0 kudos
What DDL and DML features does Delta Lake not support?
Latest Reply
Unsupported DDL features:
- ANALYZE TABLE PARTITION
- ALTER TABLE [ADD|DROP] PARTITION
- ALTER TABLE RECOVER PARTITIONS
- ALTER TABLE SET SERDEPROPERTIES
- CREATE TABLE LIKE
- INSERT OVERWRITE DIRECTORY
- LOAD DATA

Unsupported DML features:
- INSERT INTO [OVERWRITE] table wi...
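Where one of the unsupported statements is needed, there is usually a supported equivalent. A hedged sketch of two common substitutions; the table and path names are placeholders:

# Sketch only -- table and path names are placeholders.
# Instead of CREATE TABLE LIKE, copy just the schema with an empty CTAS:
spark.sql("CREATE TABLE new_table AS SELECT * FROM existing_table WHERE 1 = 0")

# Instead of INSERT OVERWRITE DIRECTORY, write files out with the DataFrame API:
spark.table("existing_table").write.format("parquet").mode("overwrite").save("/tmp/output_path")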