- 1239 Views
- 5 replies
- 9 kudos
What's the best way to set up a Private Link connection to a Databricks workspace without losing the current settings?
Latest Reply
At the Data and AI Summit, there was a nice talk about AWS PrivateLink. Here is a link if you have access: https://dataaisummit.com/session-virtual/?v2477da705118cc74fd14460db021e1784e2eed5a7982c6482ec95cb2e86d259644b8741959f52a49e0e6908b82a9d860=C22...
4 More Replies
- 1799 Views
- 5 replies
- 2 kudos
We have written a few Python functions (methods within a class) and packaged them as a wheel library. In the as-is situation we used to install that wheel library on an All-Purpose cluster that we had already created. It works fine. In the to-be situation (D...
Latest Reply
Does it give you an error when running the DLT pipeline specifically on the %pip command or does it not work in some other way? If it's the former, could you share the path format that you're using for the %pip command path?
4 More Replies
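For context, in a Delta Live Tables pipeline a wheel is usually installed with a %pip cell at the top of the pipeline notebook rather than through the cluster's library settings. A minimal sketch of such a cell, assuming the wheel was uploaded to DBFS; the path and wheel name below are hypothetical, not the poster's actual location:

```
# Notebook cell at the top of the DLT pipeline notebook (hypothetical path)
%pip install /dbfs/FileStore/libraries/my_package-0.1.0-py3-none-any.whl
```

The FUSE-style `/dbfs/...` form of the path (rather than a `dbfs:/...` URI) is the detail the reply above is asking about.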
- 571 Views
- 1 replies
- 0 kudos
Hey, I am a beginner and I am interested in learning the data engineering path for Azure Databricks. Can someone please provide some guidance?
Latest Reply
Hi @ishant jain​, The Databricks Data Science and Engineering guide provides how-to guidance to help you get the most out of the Databricks collaborative analytics platform. For tutorials and introductory information, see Get started with Databricks a...
- 2341 Views
- 4 replies
- 6 kudos
Hi, Is it possible to restrict uploading files to the DBFS root (since everyone has access)? The idea is to force users to use an ADLS2 mount with credential passthrough, for security reasons. Also, right now users use Azure Blob Explorer to interact with ADLS2...
Latest Reply
Hi @E H​, We haven't heard from you since the last response from @Arvind Ravish​, and I was checking back to see if his suggestions helped you. Otherwise, if you have a solution, please share it with the community, as it can be helpful to others. Also, p...
3 More Replies
- 1190 Views
- 1 replies
- 1 kudos
I want to convert a DataFrame to nested JSON. Source data: the DataFrame has values as shown in image 2. Expected output: I have to convert the DataFrame values to nested JSON as shown in image 1. Appreciate your help!
Latest Reply
Hi @Suman Mishra​, This article explains how to convert a flattened DataFrame to a nested structure by nesting a case class within another case class. You can use this technique to build a JSON file that can then be sent to an external API.
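The cited article's example uses Scala case classes, but the nesting idea is language-agnostic: group the flat columns that belong together into a sub-object, then serialize. A minimal plain-Python sketch of that idea, with hypothetical column names (in PySpark the same grouping would typically be done with `struct()` plus `to_json()`, but that needs a running Spark session):

```python
import json

# Hypothetical flat records, standing in for a flattened DataFrame's rows.
flat_rows = [
    {"id": 1, "name": "Alice", "city": "Pune", "zip": "411001"},
    {"id": 2, "name": "Bob", "city": "Delhi", "zip": "110001"},
]

def nest(row):
    """Group the address fields into a nested object, mirroring how a
    nested case class (or a PySpark struct column) would group them."""
    return {
        "id": row["id"],
        "name": row["name"],
        "address": {"city": row["city"], "zip": row["zip"]},
    }

nested_json = json.dumps([nest(r) for r in flat_rows], indent=2)
print(nested_json)
```

In PySpark the equivalent one-liner for the grouping step would be something like `df.select("id", "name", F.struct("city", "zip").alias("address"))`, after which `to_json` or `df.write.json` produces the nested output.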
- 1488 Views
- 1 replies
- 0 kudos
I'm a new student to the programming world, with a strong interest in data engineering and Databricks technology. I've tried this product; the UI, notebooks, and DBFS are very user-friendly and powerful. Recently, a doubt came to my mind: why doesn't Databricks s...
Latest Reply
Hi @tony gao​, Databricks doesn't support Java notebook execution directly. You can only run notebooks in R, Python, and Scala. However, there are two ways in which you can run Java code on an Azure Databricks cluster: 1. Create a jar of j...
- 712 Views
- 1 replies
- 3 kudos
Hi, I created the Community Cloud account, and I even got a mail for resetting the password. But once I try to log in at https://community.cloud.databricks.com/login.html, it does not give an error but simply hangs for some time, and then the login screen ...
Latest Reply
Hi @Debashis Mallick​, Thank you for reaching out! Let us look into this for you, and we'll check back with an update. Please share your relevant details as well as screenshots to community@databricks.com.
- 681 Views
- 2 replies
- 2 kudos
Hi Hubert! I'm working on a use case to get compute usage stats. I used boto3 code and the describe_cluster() function to get the normalized instance hours value for an EMR cluster. I would like to know if there is an equivalent for this normalized in...
Latest Reply
Hi @Bala Kowsalya​, We haven't heard from you since the last response from @Hubert Dudek​, and I was checking back to see if his suggestions helped you. Otherwise, if you have a solution, please share it with the community, as it can be helpful to other...
1 More Replies
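For readers unfamiliar with the EMR metric: normalized instance hours weight each hour of compute by the instance's size, so an hour on a large instance counts as several "normalized" hours. Databricks does not expose that exact metric; its usage accounting is based on DBUs instead, but the weighting arithmetic has the same shape. A sketch of that arithmetic, with illustrative (not official EMR or Databricks) factors:

```python
# Hypothetical per-hour normalization factors, in the spirit of EMR's
# normalized instance hours (a small instance hour = 1 unit; larger
# instances count proportionally). These numbers are illustrative only.
NORMALIZATION = {"m5.xlarge": 8, "m5.2xlarge": 16}

def normalized_hours(usage):
    """usage: list of (instance_type, hours_run) tuples.
    Returns the size-weighted hour total."""
    return sum(NORMALIZATION[itype] * hours for itype, hours in usage)

total = normalized_hours([("m5.xlarge", 3), ("m5.2xlarge", 2)])
print(total)  # 8*3 + 16*2 = 56
```

On Databricks, the analogous weight would be the per-instance DBU rate, and the billable usage tables (or the account usage page) report the resulting DBU totals directly.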
- 2887 Views
- 3 replies
- 4 kudos
In Spark, is it possible to create a persistent view on a partitioned Parquet file in Azure Blob storage? The view must be available when the cluster is restarted, without having to re-create that view, hence it cannot be a temp view. I can create a temp view, b...
Latest Reply
Here is what worked for me. Hope this helps someone else: https://stackoverflow.com/questions/72913913/spark-persistent-view-on-a-partition-parquet-file/72914245#72914245
CREATE VIEW test as select * from parquet.`/mnt/folder-with-parquet-file(s)/`
@Hu...
2 More Replies
by bdugar • New Contributor II
- 8642 Views
- 2 replies
- 2 kudos
Hi: It's possible to create temp views in PySpark using a DataFrame (df.createOrReplaceTempView()), and it's possible to create a permanent view in Spark SQL. But as far as I can tell, there is no way to create a permanent view from a DataFrame, somet...
Latest Reply
Hi Kaniz: This is what I understood from the research I did. I was curious more as to why permanent views can't be created from DataFrames, and whether this is a feature that might be implemented by Databricks or Spark at some point. Temporary views ca...
1 More Replies
by rk66 • New Contributor
- 313 Views
- 0 replies
- 0 kudos
Today, the entire private limited company registration process and other regulatory filings are paperless; documents are filed electronically through the MCA website and are processed at the Central Registration Centre (CRC). Online Private Limited Co...
by 158808 • New Contributor II
- 1216 Views
- 5 replies
- 4 kudos
Hello, Using ODBC 2.6.24.1041-2 for Linux, when inserting rows with a millisecond-precision date (e.g. 2022-07-03 13:57:48.500) I get: 2022/07/03 14:41:19 SQLExecute: {22008} [Simba][Support] (40520) Datetime field overflow resulting from inva...
Latest Reply
I was passing a string (e.g. '2022-07-03 13:57:48.500') to the Golang SQL driver, which does not work if the ms part is specified, but otherwise it works (e.g. '2022-07-03 13:57:48'). Passing a native Golang time.Time seems to work for timestamps wit...
4 More Replies
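A common client-side workaround, regardless of language, is to strip the fractional seconds from the literal before binding it, at the cost of losing the millisecond precision (the cleaner fix, as the reply above notes, is passing a native timestamp value instead of a string). A small Python sketch of the truncation; the accepted format strings are assumptions about the input:

```python
from datetime import datetime

def to_whole_seconds(ts: str) -> str:
    """Parse a timestamp that may carry fractional seconds and re-format
    it without them, e.g. for a driver that rejects millisecond
    precision in string literals."""
    for fmt in ("%Y-%m-%d %H:%M:%S.%f", "%Y-%m-%d %H:%M:%S"):
        try:
            return datetime.strptime(ts, fmt).strftime("%Y-%m-%d %H:%M:%S")
        except ValueError:
            continue  # try the next accepted format
    raise ValueError(f"unrecognized timestamp: {ts}")

print(to_whole_seconds("2022-07-03 13:57:48.500"))  # 2022-07-03 13:57:48
```

If the milliseconds must survive the round trip, truncation is not an option and the native-timestamp binding route is the one to pursue.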
- 3116 Views
- 3 replies
- 3 kudos
Latest Reply
Hi @Madelyn Mullen​, Thank you for sharing such an excellent and informative post. We hope to see these very often.
2 More Replies