Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Priyag1
by Honored Contributor II
  • 1524 Views
  • 2 replies
  • 4 kudos


Databricks new release: Full-page workspace browser includes Repos

Databricks plans to enable the full-page workspace browser experience that unifies Workspace and Repos by default. You’ll browse content in Databricks Repos alongside your workspace c...

Latest Reply
bharats
New Contributor III
  • 4 kudos

Thanks for the update

1 More Reply
apiury
by New Contributor III
  • 2466 Views
  • 4 replies
  • 2 kudos

Delta file question

Hi! I'm using Auto Loader to ingest binary files into Delta format. I have 7 binary files, but Delta generates 3 files named part-0000, part-0001... Why does it generate these files with the format part-000...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Alejandro Piury Pinzón​ We haven't heard from you since the last response from @Lakshay Goel​, and I was checking back to see if their suggestions helped you. Otherwise, if you have any solution, please share it with the community, as it can be hel...
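For context, the part-NNNN naming comes from Spark itself: each write task produces one Parquet data file, so the number of output files reflects the partitioning of the stream, not the number of input files. A minimal Auto Loader sketch of this kind of ingestion, with hypothetical paths (not taken from the thread):

    # Minimal Auto Loader sketch for binary files; all paths are placeholders.
    df = (spark.readStream
          .format("cloudFiles")
          .option("cloudFiles.format", "binaryFile")
          .load("/mnt/raw/binaries"))

    (df.writeStream
       .format("delta")
       .option("checkpointLocation", "/mnt/checkpoints/binaries")
       .start("/mnt/delta/binaries"))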

3 More Replies
harraz
by New Contributor III
  • 1375 Views
  • 1 reply
  • 0 kudos

Issues loading CSV files that contain a BOM (Byte Order Mark) character

I keep getting an error when creating a DataFrame or stream from certain CSV files where the header contains a BOM (Byte Order Mark) character. This is the error message: AnalysisException: [RequestId=e09c7c8d-2399-4d6a-84ae-216e6a9f8f6e ErrorClass=INVALI...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @mohamed harraz​ Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.
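One common workaround, as a hedged sketch (the path and header layout are assumptions): read the file as usual, then strip the BOM character (\ufeff) that can leak into the first column name:

    # Hedged sketch: remove a UTF-8 BOM from column names; the path is hypothetical.
    df = spark.read.option("header", "true").csv("/mnt/landing/input_with_bom.csv")
    df = df.toDF(*[c.lstrip("\ufeff") for c in df.columns])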

Spark4speed
by New Contributor
  • 990 Views
  • 1 reply
  • 0 kudos

CDM connector for Spark can't connect to Azure storage account

Hello, I'm trying to use the CDM connector for Spark, but I can't connect to the Azure storage account when using the connector. I mounted a container of the storage account with a SAS token. When I'm trying to read CDM data from a (mounted) storage acco...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Martijn de Bruijn​ Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.
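For reference, a minimal sketch of mounting a container with a SAS token (the account, container, and token below are placeholders); note that the CDM connector may still need credentials of its own rather than a mount:

    # Hedged sketch: mount an Azure Blob container with a SAS token; names are placeholders.
    dbutils.fs.mount(
        source="wasbs://mycontainer@myaccount.blob.core.windows.net",
        mount_point="/mnt/cdm",
        extra_configs={
            "fs.azure.sas.mycontainer.myaccount.blob.core.windows.net": "<sas-token>"
        }
    )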

jlb0001
by New Contributor III
  • 1288 Views
  • 3 replies
  • 1 kudos

[AWS] How do you replace the Account Admin?

I need to remove an older admin that previously set up the Databricks account. However, I get an error (even though I am also an Account Admin). How do I replace a prior account admin? Or at least remove their admin status and/or disable the accoun...

Databricks Permission Error - Cannot Disable Original Account
Latest Reply
jlb0001
New Contributor III
  • 1 kudos

I escalated via my Databricks rep yesterday and got an answer that seemed to be along the lines that "something is wrong here". He is going to try to find out internally and possibly work with the product development folks to come up with a solution...

2 More Replies
nlakshmidevi125
by New Contributor
  • 1530 Views
  • 2 replies
  • 1 kudos

About the .crc file in the Delta transaction log

Why is a .crc file created along with the Delta log files?

Latest Reply
Lakshay
Esteemed Contributor
  • 1 kudos

Hi @Lakshmi devi​, the .crc file is basically a checksum file that contains the stats for the respective version file. It is used for snapshot verification in the backend.
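You can see this pairing directly in a table's transaction log, where each committed version has both a .json commit file and a .crc checksum file; a minimal sketch with a hypothetical table path:

    # Hedged sketch: list the Delta transaction log; the path is hypothetical.
    display(dbutils.fs.ls("/mnt/delta/events/_delta_log"))
    # Expect pairs such as 00000000000000000000.json and 00000000000000000000.crc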

1 More Reply
dnchankov
by New Contributor II
  • 2843 Views
  • 2 replies
  • 2 kudos

Why can't the notebook I created in a Repo be opened safely?

I've cloned a Repo during "Get Started with Data Engineering on Databricks". Then I'm trying to run another notebook from a cell with the %run magic command, but I get an error that the file can't be opened safely. Here is my code: notebook_a name = "John" print(f"Hello ...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Opening in Restricted Mode is the safest choice, and you can always reopen the file in your original VS Code window once you decide the document is trustworthy.
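For reference, a %run cell refers to the other notebook by path and must be the only content of its cell; a minimal sketch, reusing the notebook name from the question:

    %run ./notebook_a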

1 More Reply
kidexp
by New Contributor II
  • 18776 Views
  • 5 replies
  • 2 kudos

Resolved! How to install python package on spark cluster

Hi, how can I install Python packages on a Spark cluster? Locally I can use pip install. I want to use some external packages which are not installed on the Spark cluster. Thanks for any suggestions.

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Install a Python package on a Spark cluster: Make a virtualenv just for your Spark nodes. Each time you run a Spark job, run a fresh pip install of all your in-house Python libraries. ... Zip up the site-packages dir of the virtualenv. ... Pass the single ...
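On Databricks specifically, a simpler route today is a notebook-scoped install; a minimal sketch (the package name is only an example):

    # Hedged sketch: notebook-scoped library install; the package is an example.
    %pip install requests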

4 More Replies
Mikki007
by New Contributor II
  • 3881 Views
  • 2 replies
  • 1 kudos

Resolved! Constructor public org.apache.spark.SparkConf(boolean) is not whitelisted.

My code:
from great_expectations.datasource import SparkDFDatasource
from pyspark.sql import SparkSession
session_name = 'mk_spark_session'
spark = SparkSession.builder.appName(session_name).getOrCreate()
datasource = SparkDFDatasource(spark)
query = "SELE...

Latest Reply
jose_gonzalez
Moderator
  • 1 kudos

Are you using a high-concurrency cluster? If so, please try to run this code on a standard cluster.

1 More Reply
parthsalvi
by Contributor
  • 2681 Views
  • 2 replies
  • 1 kudos

getContext() in dbutils.notebook not working in DBR 11.2 / 10.4 LTS Shared Mode (it works in No Isolation mode in DBR 11.2)

We are trying to fetch the notebook context in our job logging workflow: current_context = dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson(). We were able to access this in DBR 10.4 custom mode, but in DBR 10.4 & 11.2 (Shared Mode) w...

Latest Reply
Tjomme
New Contributor III
  • 1 kudos

See also: https://community.databricks.com/s/question/0D58Y00009t95NHSAY/unity-catalog-shared-access-mode-dbutilsnotebookentrypointgetcontext-not-whitelisted

1 More Reply
yopbibo
by Contributor II
  • 7259 Views
  • 6 replies
  • 0 kudos

Resolved! How do you copy the contents of a repo into Workspace/Shared automatically, daily?

How do you copy the contents of a repo into Workspace/Shared automatically, daily? The purpose here is to make some notebooks in Shared available to all workspace users, without requiring users to use Repos.

Latest Reply
citizenkrank
New Contributor II
  • 0 kudos

Alternatively, you can schedule a notebook with the following cell: %sh cp -r /Workspace/Repos/username/repo_name /Workspace/Shared. Please note that you'll have to update (i.e. pull) the repo manually if you've updated it somewhere else (although you c...
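Formatted as a notebook cell, the suggestion from the reply looks like this (username and repo_name are placeholders from the reply):

    %sh
    cp -r /Workspace/Repos/username/repo_name /Workspace/Shared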

5 More Replies
gilo12
by New Contributor III
  • 508 Views
  • 0 replies
  • 1 kudos

Connect to warehouse vs connect to compute/cluster

When I create a warehouse, there are connection details, which I was able to use successfully. But there are also connection details when I navigate in the console to Compute -> Configuration -> Advanced options -> JDBC/ODBC. What is the difference between ...
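As a hedged illustration of the difference: both endpoints speak the same JDBC/ODBC protocol, but a SQL warehouse and an all-purpose cluster expose differently shaped HTTP paths (the IDs below are placeholders):

    # SQL warehouse JDBC/ODBC HTTP path:
    #   /sql/1.0/warehouses/<warehouse-id>
    # All-purpose cluster JDBC/ODBC HTTP path:
    #   sql/protocolv1/o/<workspace-id>/<cluster-id>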

Kash
by Contributor III
  • 2574 Views
  • 3 replies
  • 0 kudos

Linear Regression HELP! Pickle + Broadcast Variable Error

Hi there, I need some help with this example. We're trying to create a LinearRegression model that can parallelize across thousands of symbols per date. When we run this we get a PicklingError. Any suggestions would be much appreciated! Error: PicklingErro...
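One common pattern for this kind of per-group fitting, as a hedged sketch (the column names and single feature are assumptions, not from the thread), is grouped pandas execution, which avoids pickling driver-side objects into tasks:

    import pandas as pd
    from sklearn.linear_model import LinearRegression

    # Hedged sketch: fit one model per symbol with applyInPandas; columns are assumed.
    def fit_symbol(pdf: pd.DataFrame) -> pd.DataFrame:
        model = LinearRegression().fit(pdf[["x"]], pdf["y"])
        return pd.DataFrame({"symbol": [pdf["symbol"].iloc[0]],
                             "coef": [float(model.coef_[0])]})

    result = df.groupBy("symbol").applyInPandas(
        fit_symbol, schema="symbol string, coef double")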

Latest Reply
Kash
Contributor III
  • 0 kudos

@Vidula Khanna​ Can you assist?

2 More Replies
BorislavBlagoev
by Valued Contributor III
  • 4035 Views
  • 4 replies
  • 9 kudos

Resolved! Delta save timestamp as timestamp with time zone

Hello! I have the following problem. I want to save a Delta table that contains timestamp columns, but when I try to write the table with Spark, the timestamp columns become timestamp with time zone. This is a problem in my case becau...
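For context, Spark's default TIMESTAMP type is interpreted in the session time zone, which is why values come back as zoned timestamps; a hedged sketch of pinning that zone for consistent rendering:

    # Hedged sketch: pin the session time zone used to interpret TIMESTAMP values.
    spark.conf.set("spark.sql.session.timeZone", "UTC")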

Latest Reply
Bhimaniheet
New Contributor II
  • 9 kudos

Hi @Hubert Dudek​, when you have time, I have described my problem. Can you please check?

3 More Replies
Saurabh707344
by New Contributor III
  • 4682 Views
  • 2 replies
  • 1 kudos

Platform and Approach Comparison

Does anyone have a structured and crisp comparison of the benefits of performing MLOps in the following ways, and the strong areas of each platform? a) Standalone Databricks, where all pipelines and orchestration are done on Databricks and external third pa...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @Saurabh Singh​, here is a structured and crisp comparison of the benefits and strong areas of each platform for performing MLOps: a) Standalone Databricks: Benefits: Unified platform: Databricks provides a unified environment for data engineering, ...

1 More Reply