- 1311 Views
- 6 replies
- 3 kudos
Hi, I'd like to create a web form with displayHTML in a notebook cell, and when the user presses the post button, I'd like to write the content of my form's text area back into the code cell of the notebook. Example: displayHTML ("""<form><textarea> u...
Latest Reply
Hi @afshin riahi​, did Dan's response help you solve your question? If it did, can you mark it as the best answer? I will help move the post to the top so others can quickly find the solution.
5 More Replies
by cig0 • New Contributor II
- 2537 Views
- 6 replies
- 2 kudos
Hi, we followed this document (https://docs.databricks.com/administration-guide/cloud-configurations/aws/vpc-peering.html) describing how to establish a connection between two (or more) VPCs in AWS, but so far we haven't been able to communicate with t...
Latest Reply
Hi @Martin Cigorraga​, if Huaming's reply fully answered your question, would you be happy to mark it as best so that others can quickly find the solution?
5 More Replies
by TJS • New Contributor II
- 12903 Views
- 6 replies
- 5 kudos
Hello, I am trying to use MLflow on a new high-concurrency cluster but I get the error below. Does anyone have any suggestions? It was working before on a standard cluster. Thanks. py4j.security.Py4JSecurityException: Method public int org.apache.spar...
Latest Reply
@Tom Soto​ We have a workaround for this. This cluster Spark configuration setting will disable Py4J security while still enabling passthrough: spark.databricks.pyspark.enablePy4JSecurity false
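For reference, this is a cluster-level setting, not a notebook one: it would go in the cluster's Spark config text box (under Advanced Options > Spark in the cluster UI), one key-value pair per line. A minimal sketch of that config fragment; note that disabling Py4J security loosens the isolation the high-concurrency cluster normally enforces, so apply it with that trade-off in mind.

```
spark.databricks.pyspark.enablePy4JSecurity false
```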
5 More Replies
- 5455 Views
- 9 replies
- 2 kudos
I completed the Data Engineering Professional and other trainings via Self-Paced Training (https://www.linkedin.com/posts/wscardua_data-engineering-professional-activity-6851487238774108160-IsTE). How many hours can I estimate for this training (and o...
- 1099 Views
- 2 replies
- 4 kudos
Hi community! I would like to know if it is possible to start a multi-task job run from a specific task. The use case is as follows: I have a 17-task job; a task in the middle, let's say a task after 2 dependencies, fails; I found the error and now it i...
Latest Reply
+1 to what @Dan Zafar​ said. We're working hard on this. Looking forward to bringing this to you in the near future.
1 More Replies
- 8379 Views
- 2 replies
- 0 kudos
I have a fixed-length file (a sample is shown below) and I want to read this file using the DataFrames API in Spark using Scala (not Python or Java). Using the DataFrames API there are ways to read a text file, a JSON file and so on, but I'm not sure if there is a wa...
Latest Reply
Find below a solution which can be used. Let us consider this is the data in the file:
EMP ID  First Name  Last Name
1       Chris       M
2       John        ...
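The idea behind the reply, slicing each line of the file at fixed column offsets, can be sketched in plain Python. The field names and widths below are assumptions based on the sample data, not the real layout; the asker wanted Scala, where the same offsets would typically be applied with substring on a DataFrame read via spark.read.text.

```python
# Sketch: parse a fixed-length file by slicing each line at known offsets.
# (name, start, end) are 0-based character offsets, end-exclusive; the
# widths here are illustrative assumptions, adjust them to the real layout.
FIELDS = [
    ("emp_id", 0, 7),
    ("first_name", 7, 18),
    ("last_name", 18, 28),
]

def parse_line(line: str) -> dict:
    """Slice one fixed-width record into named fields, trimming the padding."""
    return {name: line[start:end].strip() for name, start, end in FIELDS}

sample = [
    "1      Chris      M         ",
    "2      John       Doe       ",
]

records = [parse_line(line) for line in sample]
```

In Spark/Scala the equivalent would be reading the file as a single-column text DataFrame and deriving each field with `substring(col("value"), start + 1, width)` (Spark's `substring` is 1-based).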
1 More Replies
- 4248 Views
- 4 replies
- 2 kudos
Hello, I am trying to host my application on Databricks and I want to expose the REST APIs of my application so they can be accessed from Postman, but I am unable to find any documentation on how to do this. I tried to write a simple Flask "hello world" app to try ...
Latest Reply
I did this using an Azure web app and exposed the APIs; I was able to access them in Postman and Databricks. I did not use a Python app on Databricks.
3 More Replies
- 1279 Views
- 1 replies
- 0 kudos
How to setup a private git repository in my workspace?
Latest Reply
As a platform engineer, I would go to the admin console, click on "workspace settings", and start by looking into the settings below. Repos: true, so that Repos integration is possible. The next two settings are important to make the overall experi...
by Rnmj • New Contributor III
- 8749 Views
- 5 replies
- 7 kudos
I am trying to run Python code where a JSON file is flattened to a pipe-separated file. The code works with smaller files, but for huge files of 2.4 GB I get the error below: ConnectException: Connection refused (Connection refused). Error while obtaining a...
Latest Reply
Hi @Jose Gonzalez​, @Werner Stinckens​, @Kaniz Fatma​, thanks for your responses, much appreciated. The issue was in the code: it was Python/pandas code running on Spark, so only the driver node was being used. I validated this by increasin...
4 More Replies
by Braxx • Contributor II
- 1436 Views
- 1 replies
- 3 kudos
I have a simple API request to query a table and retrieve data, which is then loaded into a DataFrame. It may happen that it fails for different reasons. How can I retry it, let's say 5 times, when any kind of error occurs? Here is the API request: d...
Latest Reply
@Bartosz Wachocki​, use a timeout, retry interval, recursion, and exception handling. Pseudocode below:

timeout = 300

def exec_query(query, timeout):
    try:
        df = spark.createDataFrame(sf.bulk.MyTable.query(query))
    except:
        if timeout > 0:
            sleep(60)
            exec_que...
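The reply's pattern, catch any exception, sleep, and recurse until a timeout budget is exhausted, can be sketched as runnable Python. The query function below is a stand-in for the Salesforce/Spark call in the original; with timeout=300 and interval=60 this allows up to 5 retries, matching the asker's "retry 5 times".

```python
import time

def exec_query(query_fn, timeout=300, interval=60, _sleep=time.sleep):
    """Run query_fn(); on any exception, sleep and recurse with a reduced
    timeout budget. Re-raise the error once the budget is used up."""
    try:
        return query_fn()
    except Exception:
        if timeout > 0:
            _sleep(interval)
            return exec_query(query_fn, timeout - interval, interval, _sleep)
        raise

# Usage sketch: a stand-in query that fails twice, then succeeds.
attempts = {"n": 0}

def flaky_query():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("transient API error")
    return "dataframe"

# _sleep is stubbed out here so the sketch runs instantly.
result = exec_query(flaky_query, timeout=300, interval=60, _sleep=lambda s: None)
```

Passing the sleep function in as a parameter keeps the retry logic testable; in a notebook you would simply call `exec_query(lambda: spark.createDataFrame(sf.bulk.MyTable.query(query)))`.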
by trm • New Contributor II
- 1061 Views
- 2 replies
- 2 kudos
Hi all, I am new to Azure Databricks and I am using PySpark. We need to configure mail alerts for when a notebook fails or succeeds. Please can someone help me with mail configuration in Azure Databricks? Thanks.
Latest Reply
The easiest way to schedule notebooks in Azure is to use Data Factory. In Data Factory you can schedule the notebooks and define the alerts you want to send. The other option is the one Hubert mentioned.
1 More Replies
- 4925 Views
- 3 replies
- 9 kudos
Hello, I've installed databricks-connect on Windows 10:C:\Users\danoshin>pip install -U "databricks-connect==9.1.*"
Collecting databricks-connect==9.1.*
Downloading databricks-connect-9.1.2.tar.gz (254.6 MB)
|████████████████████████████████| 2...
Latest Reply
@Dmitry Anoshin​, that seems messed up. The best you can do is remove databricks-connect and also uninstall any PySpark installation, and then follow the installation guide. It should work after following the procedure. I use a Linux VM for this p...
2 More Replies
- 3934 Views
- 5 replies
- 3 kudos
We know that Databricks with VNET injection (our own VNET) allows us to connect to ADLS Gen2 over private endpoints. This is what we typically do. We have a customer who created Databricks with EnableNoPublicIP=Yes (secure cluster connectivity) and Vn...
Latest Reply
Managed VNET is locked and allows very limited config tuning, such as VNET peering, and even that is facilitated and needs to be done from the Databricks UI. If they want more control over the VNET, they need to migrate to a VNET-injected workspace.
4 More Replies