Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Rishabh-Pandey
by Esteemed Contributor
  • 1114 Views
  • 0 replies
  • 1 kudos

Live Virtual Workshop: How to Build a Golden Data Warehouse in Financial Services with Databricks

Reasons to join: Most Financial Services organizations have major on-prem investments. You can use that as your starting point to activate your organization on gold-level insights in the cloud. Providing a path to easier and quicker migration to the c...

Sourav789027
by New Contributor II
  • 758 Views
  • 1 reply
  • 1 kudos

Databricks Certifications

Hello everyone, my name is Sourav Das. I am from Kolkata, currently working as an Azure Data Engineer at Cognizant. I have cleared multiple Databricks certifications (Databricks Data Engineer Associate, Databricks Data Engineer Professional, Databricks D...

Latest Reply
gchandra
Databricks Employee
  • 1 kudos

Good luck. You can continue to improve your skills by helping other community members on this platform.

endaemon
by New Contributor II
  • 2723 Views
  • 4 replies
  • 4 kudos

Resolved! Differences among python libraries

I am confused as to the differences between various Python libraries for Databricks, especially with regard to differences among [databricks-connect](https://pypi.org/project/databricks-connect/), [databricks-api](https://pypi.org/project/databricks-...

Latest Reply
endaemon
New Contributor II
  • 4 kudos

@szymon_dybczak, thank you for typing all that up. It is very clear and helpful. Two follow-ups if I may: 1. If one's primary goal is to execute SQL queries, why prefer the Databricks SQL Connector over a generic JDBC or ODBC package? 2. Did I miss any other ...

3 More Replies
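One practical answer to the follow-up question in this thread: the Databricks SQL Connector for Python talks to a SQL warehouse over HTTP without any driver-manager or DSN setup, which is the usual reason to prefer it over a generic JDBC/ODBC package for pure SQL work. A minimal sketch, assuming `pip install databricks-sql-connector`; the hostname, HTTP path, and token are placeholders you would copy from the warehouse's "Connection details" tab:

```python
def query_databricks(server_hostname: str, http_path: str,
                     access_token: str, query: str):
    """Run one SQL query against a Databricks SQL warehouse and return all rows.

    The import is deferred so this sketch loads even where the
    databricks-sql-connector package is not installed.
    """
    from databricks import sql

    with sql.connect(
        server_hostname=server_hostname,
        http_path=http_path,
        access_token=access_token,
    ) as connection:
        with connection.cursor() as cursor:
            cursor.execute(query)
            return cursor.fetchall()
```

Compared with a generic ODBC route, there is no driver to install on the client machine, and results come back as Python objects directly.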
Sourav7890
by New Contributor III
  • 1251 Views
  • 0 replies
  • 0 kudos

Delta Live Table (Real Time Usage & Application)

Delta Live Tables, an innovation by Databricks, is a hot topic in the data field. Delta Live Tables is a declarative ETL framework. There are two types of ETL frameworks: 1) procedural ETL and 2) declarative ETL. 1) Procedural ETL involves writing code t...

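The procedural-versus-declarative contrast drawn above can be illustrated with a toy registry in plain Python: you declare *what* each table is, and a tiny "framework" decides *how* and *when* to materialize it. The `table` decorator here only mimics the shape of Delta Live Tables' `@dlt.table`; it is not the real `dlt` module:

```python
# Toy declarative-ETL sketch (NOT the real Delta Live Tables runtime).
_tables = {}

def table(func):
    """Register a table definition instead of executing it immediately."""
    _tables[func.__name__] = func
    return func

def run_pipeline():
    """The framework, not the user, drives execution (naively, in order)."""
    return {name: fn() for name, fn in _tables.items()}

@table
def raw_orders():
    return [{"id": 1, "amount": 10}, {"id": 2, "amount": 25}]

@table
def big_orders():
    # Declares a dependency on raw_orders by calling it.
    return [o for o in raw_orders() if o["amount"] > 20]

results = run_pipeline()
```

In procedural ETL you would instead write the read/transform/write steps yourself and be responsible for their ordering and retries; the declarative framework infers that from the table definitions.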
DineshReddyN
by New Contributor II
  • 2179 Views
  • 5 replies
  • 0 kudos

Filestore endpoint not visible in Databricks community edition

In Databricks Community Edition, after multiple attempts to enable it and several refreshes, I am unable to navigate to the FileStore endpoint. Under Catalog it is not visible.

[attachment: DineshReddyN_0-1729153172344.png]
Labels: Get Started Discussions, Databricks Community Edition, filestore, GUI
Latest Reply
gchandra
Databricks Employee
  • 0 kudos

Follow these alternate solutions. https://community.databricks.com/t5/data-engineering/databricks-community-edition-dbfs-alternative-solutions/td-p/94933

4 More Replies
Sudheer2
by New Contributor III
  • 824 Views
  • 1 reply
  • 0 kudos

Migrating ML Model Experiments Using Python REST APIs

Hi everyone, I’m looking to migrate ML model experiments from a source Databricks workspace to a target workspace. Specifically, I want to use Python and the available REST APIs for this process. Can anyone help me with this? Thanks in advance!

Latest Reply
gchandra
Databricks Employee
  • 0 kudos

You can use the https://github.com/mlflow/mlflow-export-import utility. The example below doesn't use Python but uses the CLI and a CI/CD pipeline to do the same: https://medium.com/@gchandra/databricks-copy-ml-models-across-unity-catalog-metastores-188...

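As a sketch of the pure-REST route the original question asks about: the workspace-level MLflow endpoints under `/api/2.0/mlflow/` can list experiments in the source workspace and recreate them by name in the target. The endpoint paths here are taken from the MLflow REST API docs and should be verified against your workspace; note this copies experiment metadata only, not runs or artifacts (which is why the mlflow-export-import utility above is usually the better tool):

```python
import json
import urllib.request

def mlflow_api_url(host: str, path: str) -> str:
    """Build a workspace MLflow REST endpoint URL from a bare hostname."""
    return f"https://{host.rstrip('/')}/api/2.0/mlflow/{path.lstrip('/')}"

def search_experiments(host: str, token: str, max_results: int = 100):
    """List experiments in a workspace (endpoint path assumed from MLflow docs)."""
    req = urllib.request.Request(
        mlflow_api_url(host, f"experiments/search?max_results={max_results}"),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("experiments", [])

def create_experiment(host: str, token: str, name: str) -> str:
    """Recreate an experiment by name in the target workspace."""
    req = urllib.request.Request(
        mlflow_api_url(host, "experiments/create"),
        data=json.dumps({"name": name}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["experiment_id"]
```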
redapplesonly
by New Contributor II
  • 2806 Views
  • 2 replies
  • 3 kudos

Resolved! Access Databricks Table with Simple Python3 Script

Hi, I'm super new to Databricks. I'm trying to do a little API scripting against my company's Databricks instance. I have this super-simple Python 3 script which is meant to run on a remote host. The script tries to run a simple SQL query against my Databricks instan...

Latest Reply
redapplesonly
New Contributor II
  • 3 kudos

@gchandra Yes! This is the documentation I was seeking! Thank you so much.

1 More Replies
Linda19
by New Contributor
  • 2671 Views
  • 3 replies
  • 2 kudos

What is the Best Postman Alternative?

Hey guys, I have been using Postman for quite some time now, got disappointed recently, and want to make a switch. Is there something better than Postman? I've heard that APIDog is much easier to use, with a much better UI, and supports all...

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

There can be only one: curl

2 More Replies
Phani1
by Valued Contributor II
  • 1884 Views
  • 1 reply
  • 0 kudos

incremental loads without date column

Hi all, we are facing a situation where our data source is Snowflake, and the data is saved in a storage location (ADLS) in Parquet format. However, the tables lack a date column or any incremental column for performing incremental loads to Dat...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Ideally you would have some change-tracking system (CDC, for example) on the source tables (Streams in the case of Snowflake; see Introduction to Streams | Snowflake Documentation). But that is not the case. So I think your approach is OK. You cannot track what is...

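When no date or incremental column exists, one common workaround is to fingerprint whole rows and diff consecutive snapshots. At Databricks scale you would do this with Spark's hash functions and a MERGE, but the idea fits in a few lines of plain Python; the column names and key here are made up for illustration:

```python
import hashlib
import json

def row_fingerprint(row: dict) -> str:
    """Stable hash of a full row; key-sorted JSON so dict order doesn't matter."""
    return hashlib.sha256(json.dumps(row, sort_keys=True).encode()).hexdigest()

def diff_batches(previous: list, current: list, key: str):
    """Classify rows as inserts/updates/deletes by comparing fingerprints."""
    prev = {r[key]: row_fingerprint(r) for r in previous}
    curr = {r[key]: row_fingerprint(r) for r in current}
    inserts = [k for k in curr if k not in prev]
    deletes = [k for k in prev if k not in curr]
    updates = [k for k in curr if k in prev and curr[k] != prev[k]]
    return inserts, updates, deletes

prev_rows = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
curr_rows = [{"id": 2, "v": "B"}, {"id": 3, "v": "c"}]
ins, upd, dele = diff_batches(prev_rows, curr_rows, "id")
# ins == [3], upd == [2], dele == [1]
```

The trade-off is that every snapshot must be read in full to compute the fingerprints, so this suits moderate table sizes or tables where CDC genuinely cannot be enabled.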
abubakar-saddiq
by New Contributor
  • 3915 Views
  • 2 replies
  • 1 kudos

How to Pass Dynamic Parameters (e.g., Current Date) in Databricks Workflow UI?

I'm setting up a job in the Databricks Workflow UI and I want to pass a dynamic parameter, like the current date (run_date), each time the job runs. In Azure Data Factory, I can use expressions like @utcnow() to calculate this at runtime. However, I w...

Latest Reply
-werners-
Esteemed Contributor III
  • 1 kudos

As szymon mentioned, dynamic parameter values exist, but the functionality is still far from what Data Factory has to offer. I am pretty sure, though, that this will be extended. So for the moment I suggest you do the value derivation in Data Factory, an...

1 More Replies
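For reference, the dynamic value references mentioned in this thread look like `{{job.start_time.iso_date}}` in job parameters, which plays roughly the role of ADF's `@utcnow()`. A sketch of a job-settings fragment; the job name and notebook path are placeholders, and the exact reference identifiers should be checked against the dynamic value references docs for your workspace:

```json
{
  "name": "daily-load",
  "tasks": [
    {
      "task_key": "ingest",
      "notebook_task": {
        "notebook_path": "/Workspace/jobs/ingest",
        "base_parameters": {
          "run_date": "{{job.start_time.iso_date}}"
        }
      }
    }
  ]
}
```

Inside the notebook the resolved value would then be read with `dbutils.widgets.get("run_date")`.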
david_nagy
by New Contributor III
  • 2697 Views
  • 7 replies
  • 1 kudos

Databricks bundle

Hey, I am new to Databricks, and I am trying to test the mlops-stack bundle. Within that bundle there is a feature-engineering workflow, and I am having a problem making it run. The main problem is the following: the bundle specified the target to be $bund...

Latest Reply
david_nagy
New Contributor III
  • 1 kudos

Yes it is.

6 More Replies
Phani1
by Valued Contributor II
  • 1482 Views
  • 1 reply
  • 0 kudos

Oracle -> Oracle Golden Gate ->Databricks Delta lake

Hi all, we have a situation where we are collecting data from different Oracle instances. The customer is using Oracle GoldenGate to replicate this data into a storage location. From there, we can use Auto Loader or Delta Live Tables to read Avro files ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @Phani1, in my opinion this is a really good setup. You have a push scenario where Oracle GoldenGate is responsible for delivering data into storage, so you don't have to bother with the extraction part. And Auto Loader is the best choice when it comes t...

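A minimal Auto Loader sketch for this setup, assuming GoldenGate lands Avro files in an ADLS path. This only runs on Databricks (the `cloudFiles` source is Databricks-specific), and both paths are placeholders:

```python
def read_goldengate_avro(spark, landing_path: str, checkpoint_path: str):
    """Incrementally ingest GoldenGate-written Avro files with Auto Loader.

    New files are discovered and processed exactly once; the schema location
    also serves as the stream's inference/evolution state.
    """
    return (
        spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "avro")
        .option("cloudFiles.schemaLocation", checkpoint_path)
        .load(landing_path)
    )
```

The returned streaming DataFrame would then be written to a Delta table with `writeStream`, or wrapped in a Delta Live Tables definition instead.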
Phani1
by Valued Contributor II
  • 494 Views
  • 0 replies
  • 0 kudos

Delta Lake to Oracle Essbase

Hi all, how can we connect Databricks Delta Lake to Essbase in OCI? We know that Essbase supports JDBC/ODBC. Is it possible to use Python or PySpark to read from Delta Lake and load the data into Essbase? I think using JDBC/ODBC might affect performan...

Phani1
by Valued Contributor II
  • 1143 Views
  • 0 replies
  • 0 kudos

Denodo Connection Parameters.

Hi all, we are establishing a connection from Denodo to Databricks. During the development phase, we utilized a personal access token associated with a developer account. However, this approach is not considered a best practice for production environm...

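For a production alternative to personal access tokens, Databricks supports OAuth machine-to-machine authentication for service principals: the service principal's client ID and secret are exchanged for a short-lived access token at the workspace's `/oidc/v1/token` endpoint, and the external tool then authenticates with that token. A standard-library sketch; the endpoint path and `all-apis` scope follow the OAuth M2M docs, but treat the exact details as assumptions to verify:

```python
import base64
import json
import urllib.parse
import urllib.request

def token_endpoint(workspace_host: str) -> str:
    """Workspace-level OAuth token endpoint (per the Databricks OAuth M2M docs)."""
    return f"https://{workspace_host.rstrip('/')}/oidc/v1/token"

def fetch_m2m_token(workspace_host: str, client_id: str, client_secret: str) -> str:
    """Exchange service-principal credentials for a short-lived OAuth token."""
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    body = urllib.parse.urlencode(
        {"grant_type": "client_credentials", "scope": "all-apis"}
    ).encode()
    req = urllib.request.Request(
        token_endpoint(workspace_host),
        data=body,
        headers={"Authorization": f"Basic {creds}",
                 "Content-Type": "application/x-www-form-urlencoded"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```

Because the token is short-lived, the caller (or the Denodo data source configuration) needs to refresh it before expiry rather than storing it like a long-lived PAT.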
Surajv
by New Contributor III
  • 1830 Views
  • 2 replies
  • 0 kudos

Restrict access of user/entity to hitting only specific Databricks Rest APIs

Hi community, assume I generate a personal access token for an entity. Post-generation, can I restrict the entity's access to specific REST APIs? In other words, consider this example where, once I generate the token and set up a bearer token b...

Latest Reply
Panda
Valued Contributor
  • 0 kudos

@Surajv You have to rely on access control settings on resources and entities (users or service principals, or create some cluster policies), rather than directly restricting the API endpoints at the token level. Note: API access based on fine-grained ...

1 More Replies
