- 5109 Views
- 3 replies
- 5 kudos
I am trying to write DataFrame data into a Delta table. Previously it was working fine, but now it is throwing "Log has failed integrity checks".
Latest Reply
@Shanmuganathan Jothikumar I have the same exception after upgrading to Unity Catalog. I need to investigate a little more, but after adding the following setting, it works: spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)
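In case it helps, a minimal sketch of how that workaround might be applied before a Delta write (the table name is hypothetical, and the setting suppresses the integrity check rather than fixing its root cause):

# Sketch: relax the Delta log integrity check, then retry the write.
# WARNING: this masks the "Log has failed integrity checks" error; still investigate the root cause.
spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)
df.write.format("delta").mode("append").saveAsTable("my_schema.my_table")  # hypothetical target table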
2 More Replies
- 3802 Views
- 3 replies
- 4 kudos
Hi everyone, I have a PySpark streaming job reading from AWS Kinesis that suddenly failed for no reason (I mean, we have not made any changes recently). It is giving the following error: ERROR MicroBatchExecution: Query kinesis_events_prod_bronz...
Latest Reply
@patricio tojo I have the same problem, although in my case it appeared after migrating to Unity Catalog. I need to investigate a little more, but adding this to my Spark job makes it work: spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)
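For the streaming case, a rough sketch of where that setting would go (stream name, region, checkpoint path, and target table are all hypothetical, and the option names assume the Databricks Kinesis source):

# Sketch: apply the workaround before starting the streaming query.
spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)
(spark.readStream
    .format("kinesis")                      # Databricks Kinesis source
    .option("streamName", "events-prod")    # hypothetical stream name
    .option("region", "us-east-1")          # hypothetical region
    .load()
    .writeStream
    .option("checkpointLocation", "/mnt/checkpoints/kinesis_events")  # hypothetical path
    .toTable("bronze.kinesis_events"))      # hypothetical target table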
2 More Replies
- 9526 Views
- 5 replies
- 2 kudos
I have been following the Terraform Databricks provider documentation in order to provision account-level resources on AWS. I can create the workspace fine, add users, etc. However, when I go to use the provider in non-MWS mode, I am re...
Latest Reply
So the answer to this was that you need to explicitly pass the provider argument to each of the data resource blocks. The docs should be updated to reflect that, i.e.:
data "databricks_spark_version" "latest" {
  provider = databricks.workspace
  ...
}
4 More Replies
- 2298 Views
- 1 replies
- 7 kudos
Is there any upcoming update in which UC will support DLT? If yes, is there an expected ETA?
Latest Reply
Pat • Honored Contributor III
Hi @Rahul Mishra, I think you need to contact your company's Databricks representative for this. The last time I heard about the ETA, it was the end of November, I believe. You might try to join the Databricks Office Hours tomorrow and ask the question, or ...
- 831 Views
- 0 replies
- 5 kudos
Topic: Radical Speed on the Lakehouse: Photon under the hood. I am Hari, and I work as a Specialist Solutions Architect at Databricks. I specialise in data engineering and cloud platform problems, helping clients in EMEA. Purpose: I recently presented a t...
- 921 Views
- 0 replies
- 7 kudos
This post is regarding the 'Data Streaming on the Lakehouse' session at Data + AI World Tour 2022 in London. I am a Resident Solutions Architect at Databricks. I specialise in data engineering. In this session, I talked about how to leverage real-time data t...
- 13768 Views
- 2 replies
- 3 kudos
By default, we return up to 1000 query results when a user runs a cell in Databricks. E.g., if you run display(storeData) and you have ten million customers, the UI will show the first 1000 results. If you graph that by age of customer, similarl...
Latest Reply
This is simple in Databricks SQL: just uncheck LIMIT 1000 in the drop-down.
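If you are in a notebook rather than Databricks SQL, a minimal sketch of pulling more rows explicitly (the row count is arbitrary, and collecting large results strains the driver):

# Sketch: fetch more than the ~1000-row display() preview by collecting explicitly.
preview = storeData.limit(10000).toPandas()  # storeData is the DataFrame from the question
print(len(preview))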
1 More Replies
- 6940 Views
- 2 replies
- 2 kudos
Is there any way to change the 1000-row display limit at the workspace, cluster, and notebook level?
Latest Reply
This is simple in Databricks SQL: just uncheck LIMIT 1000 in the drop-down.
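As a complementary sketch for notebooks, df.show(n) prints as many rows as you ask for as plain text, independent of the display() preview (df and the row count are illustrative):

# Sketch: print an arbitrary number of rows without the display() limit.
df.show(5000, truncate=False)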
1 More Replies
- 8243 Views
- 4 replies
- 8 kudos
Hi, I would like to be able to do something like this...
create table if not exists table1
using parquet
location = '/mnt/somelocation'
The location needs to be a concatenation of a static and a code-generated string. Documentation suggests that location onl...
Latest Reply
A Python f-string can be used. For example: spark.sql(f"CREATE TABLE {table_name} (id INT, name STRING, value DOUBLE, state STRING)")
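Applied to the original question, a minimal sketch of building the location from a static prefix plus a generated suffix (the suffix value is hypothetical):

# Sketch: concatenate a static and a code-generated string into the table location.
static_prefix = "/mnt/somelocation"  # from the question
run_suffix = "2022/11"               # hypothetical code-generated part
spark.sql(f"""
    CREATE TABLE IF NOT EXISTS table1
    USING PARQUET
    LOCATION '{static_prefix}/{run_suffix}'
""")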
3 More Replies
- 2700 Views
- 2 replies
- 5 kudos
Hi! Can anyone please help me with documentation for setting up the integration between Databricks and AWS without the default QuickStart CloudFormation template? I would want to use my own CFT rather than the default due to security ...
Latest Reply
Pat • Honored Contributor III
Hi @Atul S, I think Terraform is the recommended way to go for a Databricks deployment; it is also now covered by Databricks support. I haven't looked much at the CloudFormation setup, because we decided to go with Terraform in the comp...
1 More Replies
by LJ • New Contributor II
- 10953 Views
- 1 replies
- 4 kudos
list1 = ('alpha', 'beta', 'gamma', 'eta', 'Theta')
list2 = ('alpha', 'beta')
df1 = spark.createDataFrame(list1, 'String').withColumnRenamed('value', 'dataset')
df2 = spark.createDataFrame(list2, 'String').withColumnRenamed('value', 'dataset')
df = df1.ex...
Latest Reply
Every time the user changes the selection in the widget, you get that selection. The behavior after a dropdown change can be defined in the widget settings.
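A minimal sketch of that pattern with a dropdown widget (widget and column names follow the question's example; the filter is illustrative):

# Sketch: create a dropdown widget and read the current selection on each run.
dbutils.widgets.dropdown("dataset", "alpha", ["alpha", "beta", "gamma"], "Dataset")
selected = dbutils.widgets.get("dataset")
df_filtered = df.filter(df.dataset == selected)  # df from the question's code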
- 6272 Views
- 0 replies
- 5 kudos
I have upgraded my expired Student subscription to 'Azure subscription 1' in the Azure portal today. I want to use Databricks for personal projects as pay-as-you-go. When I go to my Databricks workspace and my notebook and try to create a cluster, Comp...
- 1114 Views
- 1 replies
- 1 kudos
Hi @Kaniz Fatma, I attended the webinar on 18th Oct and uploaded the Data Lakehouse cert, but I haven't received the voucher as of now. I also didn't receive the Data Engineering Associate certificate. Please help!
Latest Reply
Hi @Arindam Bose, just a friendly follow-up: have you got your certification and badge? If yes, please mark the answer as best. Thanks and regards
- 2170 Views
- 0 replies
- 2 kudos
I would appreciate it if there were a Black Friday deal on the Databricks Data Engineering Associate course, or if I could get a personal coupon.