- 1880 Views
- 2 replies
- 0 kudos
Records are missing when creating new DataFrames from one big DataFrame using filter
Hi, I have data in a file like the one below. My input file contains different types of rows; column number 8 defines the type of each record. In the file above there are 4 types of records, 00 to 03. My requirement is: there will be multiple files in the source path, e...
(Screenshots attached to the original post: Policepatil_1, Policepatil_3, Policepatil_4, Policepatil_5 — images not available.)
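A minimal sketch of the splitting step described above, assuming a delimited file where the 8th column (renamed here to `record_type` for illustration) carries the 00-03 code; the source path and read options are placeholders:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Placeholder path and options; adjust to the real file layout.
df = spark.read.option("header", "false").csv("/mnt/source_path/*.txt")

# Column 8 is _c7 with Spark's default column names; cache so each filter
# reuses a single scan instead of re-reading the files lazily per filter.
base = df.withColumnRenamed("_c7", "record_type").cache()

# One DataFrame per record type (00-03), all derived from the same big DataFrame.
by_type = {t: base.filter(F.col("record_type") == t) for t in ["00", "01", "02", "03"]}
by_type["01"].show()
```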
Hi @Retired_mod, if I run again with the same files, sometimes records are missing from the same files as in the previous run, and sometimes from a different file. Example: run 1: 1 record missing in file1, no issue with the other files; run 2: 1 record missin...
- 3028 Views
- 0 replies
- 0 kudos
Dashboard backup/download
Hello all, I'm trying to download all the dashboard definitions; however, I can only download the folder structure with no files inside. The procedure I'm using is: * go to the dashboard folder, * download it as a DBC or source archive. Unfortunately, the DBC ...
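If the goal is a scripted backup, one possible angle (a rough sketch, not a confirmed fix) is to walk the folder with the Workspace API and export each object individually; whether this captures the dashboard definitions depends on the dashboard type. The host, token, and folder path below are placeholders:

```python
import base64
import requests

HOST = "https://<workspace-host>"             # placeholder
TOKEN = "<personal-access-token>"             # placeholder
FOLDER = "/Users/<me>/dashboards"             # placeholder workspace folder
headers = {"Authorization": f"Bearer {TOKEN}"}

# List the folder, then export each non-directory object as SOURCE.
objs = requests.get(f"{HOST}/api/2.0/workspace/list",
                    headers=headers, params={"path": FOLDER}).json().get("objects", [])
for obj in objs:
    if obj.get("object_type") == "DIRECTORY":
        continue
    resp = requests.get(f"{HOST}/api/2.0/workspace/export",
                        headers=headers,
                        params={"path": obj["path"], "format": "SOURCE"}).json()
    if "content" in resp:
        with open(obj["path"].split("/")[-1] + ".export", "wb") as f:
            f.write(base64.b64decode(resp["content"]))
```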
- 1709 Views
- 2 replies
- 0 kudos
Error while creating external table in unity catalog
I am trying to create an external table from a CSV file stored in ADLS Gen2. My account owner has created a storage credential and an external location, and I am a Databricks user with all privileges on the external location. When trying to create a tabl...
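For reference, a minimal sketch of the CREATE statement such a setup usually needs, assuming the external location covers the path below; the catalog, schema, table, container, and storage-account names are placeholders:

```python
from pyspark.sql import SparkSession

# On Databricks this returns the ambient session.
spark = SparkSession.builder.getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS my_catalog.my_schema.my_external_table
    USING CSV
    OPTIONS (header 'true', inferSchema 'true')
    LOCATION 'abfss://<container>@<storage-account>.dfs.core.windows.net/path/to/csv/'
""")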
Hi @Shubhanshu, could you please try https://learn.microsoft.com/en-us/answers/questions/1314651/error-invalid-configuration-value-detected-for-fs and see if that works?
- 1472 Views
- 1 replies
- 0 kudos
Issue with Databricks notebooks and the 'requests' Python library: inconsistent output
I have a strange issue with Databricks notebooks and Google Colab notebooks, where I cannot get results from the requests library that are consistent with what I get on my local computer. Dir Surveys (wyo.gov) ("http://pipeline.wyo.gov/r_Direction...
Additional information: when I download the HTML file from the above-mentioned website, I still do not get the same HTML page as I get from the same code run on my own computer or any other computer. The size of the downloaded HTML file is the same as I get ...
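A small diagnostic sketch, assuming the divergence comes from how the server answers different clients; the URL placeholder stands in for the truncated link above, and the header is illustrative, not a known fix:

```python
import requests

url = "<page-url>"  # placeholder for the pipeline.wyo.gov page from the post

resp = requests.get(
    url,
    headers={"User-Agent": "Mozilla/5.0"},  # assumption: the server may vary output by client
    timeout=30,
)

# Compare these values between Databricks, Colab, and the local machine.
print(resp.status_code, len(resp.content))
print(resp.headers.get("Content-Type"), resp.headers.get("Server"))
```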
- 1734 Views
- 1 replies
- 0 kudos
REST API
I am creating an application to capture cluster metrics and sending an HTTP REST request to the Spark History Server's API endpoint to retrieve the list of applications. This request doesn't generate logs in the Spark History Server's log files. The Spark Hist...
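For context, a minimal sketch of the call described above against the Spark History Server REST API; the host and port are placeholders:

```python
import requests

history_server = "http://<history-server-host>:18080"  # placeholder host and default port

resp = requests.get(f"{history_server}/api/v1/applications", timeout=30)
resp.raise_for_status()

# Each entry describes one Spark application known to the history server.
for app in resp.json():
    print(app["id"], app["name"])
```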
- 2955 Views
- 1 replies
- 0 kudos
Pass a tuple as a parameter to a SQL query
at_lst = ['131','132','133']
at_tup = (*at_lst,)
print(at_tup)  # ('131','132','133')
In my SQL query, I am trying to pass this as a parameter; however, it doesn't work:
%sql
select * from ma...
@Retired_mod I am writing sql using the magic command in the cell block, `%%sql`. Is there a way to pass a parameter in the query without using the `execute` method of the cursor object? Can you please share an example?
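One hedged way to do this without a cursor, sketched below: build the IN list in Python and run the statement through spark.sql instead of a %sql cell. The table and column names are placeholders for the truncated query in the post:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

at_lst = ['131', '132', '133']

# Build the IN clause as a string; fine for trusted, literal values like these.
in_clause = ", ".join(f"'{v}'" for v in at_lst)

# Placeholder table and column names.
df = spark.sql(f"SELECT * FROM my_catalog.my_schema.my_table WHERE my_code IN ({in_clause})")
df.show()
```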
- 2520 Views
- 0 replies
- 0 kudos
Ingesting PowerBI Tables to databricks
Hi Community, I am looking for a way to access Power BI tables from Databricks and import them as a Spark DataFrame into my Databricks notebook. As far as I have seen, there is a Power BI connector to load data from Databricks into Power BI, but not...
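One hedged sketch of a possible approach: pull rows out of a Power BI dataset with the REST "Execute Queries" endpoint and build a Spark DataFrame from the response. The token, dataset id, DAX table name, and the response shape assumed here are all assumptions:

```python
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

token = "<aad-access-token>"   # placeholder: AAD token with dataset read permission
dataset_id = "<dataset-id>"    # placeholder

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/executeQueries",
    headers={"Authorization": f"Bearer {token}"},
    json={"queries": [{"query": "EVALUATE 'MyTable'"}]},  # DAX query; table name is a placeholder
    timeout=60,
)
resp.raise_for_status()

# Assumed response shape: results -> tables -> rows (list of dicts).
rows = resp.json()["results"][0]["tables"][0]["rows"]

df = spark.createDataFrame(rows)
df.show()
```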
- 1919 Views
- 0 replies
- 0 kudos
Changing StreamingWrite API in DBR 13.1 may lead to incompatibility with Spark 3.4
I'm using the StarRocks Connector[2] to ingest data into StarRocks on Databricks 13.1 (powered by Spark 3.4.0). The connector runs on community Spark 3.4 but fails on the DBR. The reason is (the full stack trace is attached): java.lang.IncompatibleClass...
- 827 Views
- 0 replies
- 0 kudos
Facing UNKNOWN_FIELD_EXCEPTION.NEW_FIELDS_IN_FILE
[UNKNOWN_FIELD_EXCEPTION.NEW_FIELDS_IN_FILE] Encountered unknown fields during parsing: [<field_name>], which can be fixed by an automatic retry: true. I am using Azure Databricks and writing Python code. I want to catch the error and raise it. Tried wi...
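A sketch of one way to catch and re-raise it, assuming the error comes from an Auto Loader stream with schema evolution and surfaces through the stream's termination; all paths and the target table are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

try:
    query = (spark.readStream.format("cloudFiles")
                  .option("cloudFiles.format", "json")
                  .option("cloudFiles.schemaLocation", "/mnt/checkpoints/schema")  # placeholder
                  .load("/mnt/landing/")                                           # placeholder
                  .writeStream
                  .option("checkpointLocation", "/mnt/checkpoints/bronze")         # placeholder
                  .trigger(availableNow=True)
                  .toTable("bronze.events"))                                       # placeholder
    query.awaitTermination()
except Exception as e:
    if "UNKNOWN_FIELD_EXCEPTION" in str(e):
        # New columns were found; the schema location has been updated, so retrying
        # the same stream typically succeeds. Log, then re-raise as the post asks.
        print(f"Schema evolved: {e}")
        raise
    raise
```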
- 708 Views
- 0 replies
- 0 kudos
Data categories for databases
Is there a way to automate data categorisation with the OpenAI API?
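A hedged sketch of what that could look like with the OpenAI Python SDK (v1.x), assuming "categorisation" means assigning each column a label; the model name, category set, and prompt are illustrative:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def categorise_column(table: str, column: str, sample_values: list[str]) -> str:
    """Ask the model for a single category label for one column (illustrative prompt)."""
    prompt = (
        f"Classify the database column {table}.{column} into one of: "
        f"PII, financial, datetime, identifier, free_text. "
        f"Sample values: {sample_values}. Answer with the category only."
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content.strip()


print(categorise_column("sales", "customer_email", ["a@b.com", "c@d.org"]))
```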
- 4961 Views
- 0 replies
- 0 kudos
Permissions on Delta Live Table (DLT) pipelines
I have a large collection - and growing daily - of DLT pipelines and I need to grant access to non-admin users. Do I need to assign permissions on each individual DLT pipeline or is there a better approach?
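One possible approach, sketched below under the assumption that the Permissions REST API for pipelines and a single group grant are acceptable; the host, token, and group name are placeholders, and pagination of the pipeline list is omitted:

```python
import requests

HOST = "https://<workspace-host>"   # placeholder
TOKEN = "<personal-access-token>"   # placeholder
GROUP = "analysts"                  # placeholder group name
headers = {"Authorization": f"Bearer {TOKEN}"}

# List DLT pipelines, then grant the group CAN_VIEW on each one.
pipelines = requests.get(f"{HOST}/api/2.0/pipelines", headers=headers).json().get("statuses", [])
for p in pipelines:
    requests.patch(
        f"{HOST}/api/2.0/permissions/pipelines/{p['pipeline_id']}",
        headers=headers,
        json={"access_control_list": [
            {"group_name": GROUP, "permission_level": "CAN_VIEW"}
        ]},
    )
```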
- 800 Views
- 1 replies
- 0 kudos
How do I regulate notebook cache
I am experiencing an over-caching problem in a Databricks notebook. If I display different DataFrames, one of them gets cached and its result then shows up for the others afterwards. How can I avoid this caching behaviour while using the notebook?
I don't exactly understand what your issue is. Can you elaborate more?
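If the problem is cached results leaking from one display to the next, a minimal sketch of clearing them (df1 below is a stand-in for one of the displayed DataFrames):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df1 = spark.range(10).cache()   # stand-in for one of the displayed DataFrames
df1.count()                     # materialises the cache

df1.unpersist()                 # drop the cache for that specific DataFrame
spark.catalog.clearCache()      # or drop everything cached in this Spark session
```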
Labels: AI Summit (4), Azure (2), Azure databricks (2), Bi (1), Certification (1), Certification Voucher (2), Community (7), Community Edition (3), Community Members (1), Community Social (1), Contest (1), Data + AI Summit (1), Data Engineering (1), Databricks Certification (1), Databricks Cluster (1), Databricks Community (8), Databricks community edition (3), Databricks Community Rewards Store (3), Databricks Lakehouse Platform (5), Databricks notebook (1), Databricks Office Hours (1), Databricks Runtime (1), Databricks SQL (4), Databricks-connect (1), DBFS (1), Dear Community (1), Delta (9), Delta Live Tables (1), Documentation (1), Exam (1), Featured Member Interview (1), HIPAA (1), Integration (1), LLM (1), Machine Learning (1), Notebook (1), Onboarding Trainings (1), Python (2), Rest API (10), Rewards Store (2), Serverless (1), Social Group (1), Spark (1), SQL (8), Summit22 (1), Summit23 (5), Training (1), Unity Catalog (3), Version (1), VOUCHER (1), WAVICLE (1), Weekly Release Notes (2), weeklyreleasenotesrecap (2), Workspace (1)