Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Junaid_Ali
by New Contributor II
  • 679 Views
  • 0 replies
  • 0 kudos

Creating external location is failing because of cross-plane request

While creating a Unity Catalog external location from the Databricks UI or from a notebook using "CREATE EXTERNAL LOCATION location_name ..", a connection from the control plane to the S3 data bucket is being made and rejected in a PrivateLink-enabled environm...

Get Started Discussions
Unity Catalog
VPC
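For reference, a minimal sketch of the statement the post above refers to, as it might be run from a notebook where spark is predefined; the location name, bucket path, and storage credential are placeholders and assume a storage credential has already been created:

# Hypothetical sketch: create a Unity Catalog external location over an S3 path.
# Names and paths below are placeholders, not the poster's environment.
spark.sql("""
    CREATE EXTERNAL LOCATION IF NOT EXISTS my_external_location
    URL 's3://my-data-bucket/landing/'
    WITH (STORAGE CREDENTIAL my_storage_credential)
""")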
ChristianRRL
by Valued Contributor
  • 661 Views
  • 0 replies
  • 0 kudos

Source to Bronze Organization + Partition

Hi there, I hope I have what is effectively a simple question. I'd like to ask for a bit of guidance on whether I am structuring my source-to-bronze Auto Loader data properly. Here's what I have currently: /adls_storage/<data_source_name>/<category>/autoloade...

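As a point of comparison for the layout question above, a minimal Auto Loader sketch for one source/category folder; the paths, file format, and target table are placeholder assumptions, not the poster's actual setup:

# Hypothetical sketch: Auto Loader ingesting one landing folder into a bronze table.
# All paths, the file format, and the table name are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

source_path = "/adls_storage/my_source/my_category/autoloader_landing/"   # placeholder
checkpoint = "/adls_storage/_checkpoints/my_source/my_category/"          # placeholder
bronze_table = "bronze.my_source_my_category"                             # placeholder

(
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")               # assumed source format
    .option("cloudFiles.schemaLocation", checkpoint)   # where the inferred schema is tracked
    .load(source_path)
    .writeStream
    .option("checkpointLocation", checkpoint)
    .trigger(availableNow=True)                        # incremental, batch-style run
    .toTable(bronze_table)
)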
pablobd
by Contributor II
  • 3043 Views
  • 2 replies
  • 0 kudos

Install Python package from private repo [CodeArtifact]

As part of my MLOps stack, I have developed a few packages which are then published to a private AWS CodeArtifact repo. How can I connect the AWS CodeArtifact repo to Databricks? I want to be able to add these packages to the requirements.txt of a mod...

Latest Reply
pablobd
Contributor II
  • 0 kudos

One way to do it is to run this line before installing the dependencies:
pip config set site.index-url https://aws:$CODEARTIFACT_AUTH_TOKEN@my_domain-111122223333.d.codeartifact.region.amazonaws.com/pypi/my_repo/simple/
But can we add this in MLflow?

  • 0 kudos
1 More Replies
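A minimal sketch of how the pip configuration from the reply above could be applied programmatically before installing dependencies; the domain, account ID, region, repository, and package names are placeholders, and the CODEARTIFACT_AUTH_TOKEN environment variable is assumed to be provisioned already:

# Hypothetical sketch: point pip at a private AWS CodeArtifact index, then install from it.
# Domain, account ID, region, repository, and package names are placeholders.
import os
import subprocess

token = os.environ["CODEARTIFACT_AUTH_TOKEN"]   # assumed to be set beforehand
index_url = (
    f"https://aws:{token}@my_domain-111122223333.d.codeartifact."
    "us-east-1.amazonaws.com/pypi/my_repo/simple/"
)

# Persist the index URL (site scope, as in the reply) so later pip installs use it.
subprocess.run(["pip", "config", "set", "site.index-url", index_url], check=True)

# Subsequent installs now resolve packages from the private repository.
subprocess.run(["pip", "install", "my_private_package"], check=True)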
shkelzeen
by New Contributor II
  • 1326 Views
  • 1 reply
  • 0 kudos

Databricks JDBC driver: multiple queries in one request

Can I run multiple queries in one command using the Databricks JDBC driver, and would Databricks execute one query faster than running multiple queries in one script?

ChristianRRL
by Valued Contributor
  • 1640 Views
  • 2 replies
  • 0 kudos

Auto Loader Use Case Question - Centralized Dropzone to Bronze?

Good day, I am trying to use Auto Loader (potentially extending into DLT in the future) to easily pull data coming from an external system (currently located in a single location) and organize and load it accordingly. I am struggling quite a bit a...

Latest Reply
ChristianRRL
Valued Contributor
  • 0 kudos

Quick follow-up on this @Retired_mod (or to anyone else in the Databricks multi-verse who is able to help clarify this case). I understand that the proposed solution would work for a "one-to-one" case where many files are landing in a specific dbfs pa...

  • 0 kudos
1 More Replies
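To illustrate the "many sources in one dropzone" scenario discussed above, a hedged sketch that starts one Auto Loader stream per subfolder of a central landing area; the dropzone path, file format, and naming convention are assumptions, for a Databricks notebook where spark and dbutils are predefined:

# Hypothetical sketch: one Auto Loader stream per source subfolder under a central dropzone.
# Paths, the file format, and the bronze schema are placeholders.
dropzone = "/mnt/dropzone/"   # placeholder central landing area

for entry in dbutils.fs.ls(dropzone):              # one subfolder per external source (assumption)
    source_name = entry.name.strip("/")
    (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "csv")         # assumed source format
        .option("cloudFiles.schemaLocation", f"/mnt/_schemas/{source_name}/")
        .load(entry.path)
        .writeStream
        .option("checkpointLocation", f"/mnt/_checkpoints/{source_name}/")
        .trigger(availableNow=True)
        .toTable(f"bronze.{source_name}")
    )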
Cosmin
by New Contributor II
  • 1925 Views
  • 3 replies
  • 0 kudos

Failing to write large dataframe

Hi all, we have an issue while trying to write a quite large data frame, close to 35 million records. We try to write it as parquet and also as a table, and neither works. But writing a small chunk (10k records) works. Basically we have some text on which ...

(attached screenshot: Cosmin_2-1702640369404.png)
Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

That could work, but you will have to create a UDF. Check this SO topic for more info.

  • 0 kudos
2 More Replies
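A minimal sketch of the UDF route suggested in the reply above, assuming the goal is to apply per-row text processing across the full ~35M-row DataFrame rather than collecting it; the source table, column names, cleaning logic, and output path are placeholders:

# Hypothetical sketch: wrap the per-row text processing in a Spark UDF and write the result.
# Source table, column names, the cleaning logic, and the output path are placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType

spark = SparkSession.builder.getOrCreate()

def clean_text(value):
    # placeholder for whatever per-row processing the text column needs
    return value.strip().lower() if value is not None else None

clean_text_udf = F.udf(clean_text, StringType())

df = spark.table("my_schema.large_source")                        # placeholder ~35M-row table
df_out = df.withColumn("text_clean", clean_text_udf(F.col("text")))
df_out.write.mode("overwrite").parquet("/mnt/output/large_df/")   # placeholder output path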
vrajesh123
by New Contributor II
  • 4383 Views
  • 1 reply
  • 0 kudos

Webassessor Secure Browser will not Launch during exam.

Hello - I registered for the Databricks Data Engineering Associate Certification exam. I hit an issue: their Secure Browser would not launch, it just crashed. The only thing I could see in a flash was "bad request" and poof, it's gone. Spent over 2 h...

Get Started Discussions
Certification Exam
Issues
Secure Browser
sudhanshu1
by New Contributor III
  • 8416 Views
  • 2 replies
  • 2 kudos

Python file testing using pytest

Hi All, I have a requirement in my project where we will be writing some Python code inside Databricks. Please note we will not be using PySpark; it will be plain Python with polars. I am looking into how to create test files for the main file. Below is sim...

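Sketching the kind of layout asked about above: a plain-Python function using polars plus a pytest test for it, in the spirit of the (truncated) example; the module, function, and column names are made up for illustration:

# Hypothetical sketch - main.py: plain Python with polars, no PySpark.
import polars as pl

def add_total(df: pl.DataFrame) -> pl.DataFrame:
    # Add a 'total' column as price * quantity.
    return df.with_columns((pl.col("price") * pl.col("quantity")).alias("total"))

# Hypothetical sketch - test_main.py: a pytest test exercising the function above.
from polars.testing import assert_frame_equal
# from main import add_total   # import path depends on how the repo is laid out

def test_add_total():
    df = pl.DataFrame({"price": [2.0, 3.0], "quantity": [1, 4]})
    result = add_total(df)
    expected = df.with_columns(pl.Series("total", [2.0, 12.0]))
    assert_frame_equal(result, expected)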
pablobd
by Contributor II
  • 2543 Views
  • 1 reply
  • 0 kudos

Asset bundle: build and deploy Python wheel with versions

Hi all, I was able to deploy a wheel to the /Shared/ folder from a repository in Gitlab with asset bundles. The databricks.yml looks something like this:
artifacts:
  default:
    type: whl
    build: poetry build
    path: .
targets:
  workspace:
    h...

Latest Reply
pablobd
Contributor II
  • 0 kudos

Finally I decided to use AWS CodeArtifact and mirror PyPI, which I think is a bit cleaner. But your solution looks good too. Thanks!

  • 0 kudos
leelee3000
by Databricks Employee
  • 1402 Views
  • 2 replies
  • 0 kudos

Time travel with DLT

Needed some help with time travel with Delta Live Tables. We were trying to figure out if we can go in and alter the history on this table, and what would happen to data that we mass upload? By this we mean we have data from the past that we would ...

Latest Reply
BigRoux
Databricks Employee
  • 0 kudos

Delta Live Tables leverage Delta Lake, i.e. Delta tables. Delta tables, through transactions (e.g. insert, update, delete, merge, optimize), create versions of said Delta table. Once a version is created it cannot be altered; it is immutable. Yo...

  • 0 kudos
1 More Replies
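To make the reply above concrete, a small sketch of inspecting and reading earlier, immutable versions of the Delta table behind a pipeline; the table name and version number are placeholders, and spark is assumed to be the notebook's predefined SparkSession:

# Hypothetical sketch: list Delta table versions and read the table as of an older version.
# The table name and the version number are placeholders.
from delta.tables import DeltaTable

table_name = "my_catalog.my_schema.my_dlt_table"   # placeholder

# Each insert/update/delete/merge/optimize transaction created one of these versions.
DeltaTable.forName(spark, table_name).history().select(
    "version", "timestamp", "operation"
).show(truncate=False)

# Read an earlier version; historical versions themselves cannot be altered.
df_v0 = spark.read.option("versionAsOf", 0).table(table_name)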
Phani1
by Valued Contributor II
  • 1448 Views
  • 1 reply
  • 0 kudos

Upgrade Spark version 3.2 to 3.4+

Hi Team, we would like to upgrade from Spark version 3.2 to 3.4+ (Databricks Runtime 10.4 to 12.2/13.3). We would like to understand how complex this upgrade is and what challenges we may face. What are the technical steps and precautions we need to ...

Phani1
by Valued Contributor II
  • 1720 Views
  • 0 replies
  • 0 kudos

Customer Managed Keys in Databricks (AWS)

Hi Databricks Team, could you please provide the detailed steps on how to enable customer-managed keys in a Databricks (AWS) account? If there is any video on it, that would be greatly helpful. Regards, Phanindra

