Voucher sending date
Completed the attached course today, Nov 3rd. To avoid writing in again to request a voucher, when is the voucher for this one expected to be sent? Thanks
I have a query that is hitting a table I have access to. Granting access to everyone is not an option. I am using this query in a SQL Dashboard. One of the WHERE clause conditions uses a parameter populated by another query. I want this parameter qu...
It is not possible to do what I want. It somewhat seems like a security flaw, but whatever.
I need to run the contents of a folder, which can change over time. Is there a way to set up a notebook that can orchestrate running all notebooks in a folder? My thought was that if I could retrieve a list of the notebooks, I could create a loop to ru...
List all notebooks by making an API call, and then run them using dbutils.notebook.run:

import requests
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
host_name = ctx.tags().get("browserHostName").get()
host_token = ctx.apiToke...
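A minimal sketch completing that approach, assuming the notebooks live under a hypothetical folder /Shared/my_folder and that running them sequentially with a 1-hour timeout is acceptable:

```python
import requests

# Recover the workspace host and an API token from the notebook context.
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
host_name = ctx.tags().get("browserHostName").get()
host_token = ctx.apiToken().get()

folder = "/Shared/my_folder"  # hypothetical folder to orchestrate

# List everything in the folder via the Workspace API.
resp = requests.get(
    f"https://{host_name}/api/2.0/workspace/list",
    headers={"Authorization": f"Bearer {host_token}"},
    params={"path": folder},
)
resp.raise_for_status()

# Run each notebook in the folder; non-notebook objects are skipped.
for obj in resp.json().get("objects", []):
    if obj.get("object_type") == "NOTEBOOK":
        result = dbutils.notebook.run(obj["path"], 3600)
        print(obj["path"], "->", result)
```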
The e-learning videos on DBacademy say we should click on "Repos" and "Add Repo". I cannot find this in my Community Edition UI. I am a little frustrated that there are so many different versions of the UI, and many videos show UI options that we cannot ...
Hello, just import the .dbc file directly into your user workspace, as explained by Databricks here: https://www.databricks.training/step-by-step/importing-courseware-from-github/ That is the simplest way.
What is the best practice for logging in Databricks notebooks? I have a bunch of notebooks that run in parallel through a workflow. I would like to keep track of everything that happens, such as errors coming from a stream. I would like these logs to ...
@Gimwell Young As @Debayan Mukherjee mentioned, if you configure verbose logging at the workspace level, the logs will be delivered to the storage bucket you provided during configuration. From there you can pull the logs into any of your licensed log mo...
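For application-level logs (as opposed to the audit logs above), a minimal sketch is to use the standard Python logging module, write to local disk, and copy the file to durable storage at the end of the run. The paths and logger name here are assumptions for illustration:

```python
import logging

# Write to local disk first: appending to /dbfs paths directly can be
# unreliable, so copy the finished file to the bucket afterwards.
local_log = "/tmp/my_pipeline.log"              # hypothetical local path
durable_log = "dbfs:/mnt/logs/my_pipeline.log"  # hypothetical bucket mount

logger = logging.getLogger("my_pipeline")
logger.setLevel(logging.INFO)
handler = logging.FileHandler(local_log)
handler.setFormatter(logging.Formatter(
    "%(asctime)s %(levelname)s %(name)s %(message)s"))
logger.addHandler(handler)

try:
    logger.info("run started")
    # ... notebook work here ...
except Exception:
    logger.exception("run failed")  # records the full traceback
    raise
finally:
    handler.flush()
    dbutils.fs.cp(f"file:{local_log}", durable_log)
```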
Issue with cluster while creating a new workspace: I am not able to create a new workspace in Databricks using Quickstart. When I create the workspace I get a "Rollback failed" error from AWS, even though I have given all the required information. Kindly he...
Hi @Gopichandran N, could you please add more information on the issue that you are facing? Could you please add a screenshot of the error?
I'm thinking of using Auto Loader to process files being put on our data lake. Let's say, e.g., every 15 minutes a parquet file is written. These files, however, contain overlapping data. Now, every 2 hours I want to process the new data (Auto Loader) and...
What about foreachBatch and then MERGE? Alternatively, run another process that will clean up the updates using a window function, as you said.
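A minimal sketch of that first suggestion, assuming a Delta target table named `target`, a key column `id`, and an ordering column `updated_at` (all hypothetical names), with Auto Loader feeding foreachBatch:

```python
from delta.tables import DeltaTable
from pyspark.sql import functions as F, Window

def upsert_batch(batch_df, batch_id):
    # Deduplicate overlapping rows within the micro-batch: keep only the
    # latest record per key before merging.
    w = Window.partitionBy("id").orderBy(F.col("updated_at").desc())
    deduped = (batch_df
               .withColumn("rn", F.row_number().over(w))
               .filter("rn = 1")
               .drop("rn"))
    (DeltaTable.forName(spark, "target").alias("t")
        .merge(deduped.alias("s"), "t.id = s.id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())

(spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "parquet")
    .schema(source_schema)            # schema assumed defined elsewhere
    .load("/mnt/lake/incoming")       # hypothetical landing path
    .writeStream
    .foreachBatch(upsert_batch)
    .option("checkpointLocation", "/mnt/lake/_chk/incoming")
    .trigger(once=True)               # schedule the job every 2 hours
    .start())
```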
I need to move a group of files (Python or Scala files) or a folder from a DBFS location to a user workspace directory in Azure Databricks to do testing on the files. It's very difficult to upload each file one by one into the user workspace directory, so is it...
Hey guys, I don't know if I'm just tired. I'm asking for your help because I don't understand where the difference in the number of fields comes from. Thanks! I'm replicating SCD Type 2 based on this documentation: https://docs.delta.io/latest/delta-update.html#slowly-chan...
How can I check who made changes in a Databricks notebook, and what changes were made?
Hi there, I ran into an issue with Databricks Repos. When I create a new repo, it doesn't pull the default branch from GitLab. I have the default branch 'development', but the Databricks repo pulls a different branch. Moreover, this newly added repo already contains change...
I had a similar issue, where a few notebooks popped up as changed in every branch I created (even though they were not touched). So first I manually discarded the changes, but that was not a solution, as with every new branch they were there ...
First, I tried to configure Auto Loader in file notification mode to access the premium blob storage 'databrickspoc1' (Premium, ADLS Gen2). I get this error: com.microsoft.azure.storage.StorageException: I checked my storage account->N...
When you created the premium account, did you choose "Premium account type" as "File shares"? It should be "Block blobs".
Hi, I am trying to load data from the data lake into a SQL table using a "SourceDataFrame.write" operation in a notebook using Apache Spark. This seems to load duplicates at random times. The logs don't give much information, and I am not sure what else t...
Can you elaborate a bit more on this notebook? And also, what Databricks Runtime version are you on?
Navigate and discover content more efficiently with Search in Databricks
Hi all, Justin Kim here. I'm the Databricks product manager responsible for content organization and navigation in our product, which includes Search. Great to see you on the Com...
@Justin Kim Thank you for the quick reply. Usually "Last modified" means recent changes, right (that can be the last 24 hrs or whatever cap limit we add), whereas "Anytime" should show all notebooks or tables from the start? That is where I got confused.
What happens when the jobs/create REST API command is run multiple times (say, 3 times) with the same JSON configuration? Will 3 jobs be created with the same name, or will only 1 job be created?
Hi @Santhosh Raj, each successful jobs/create call creates a new job with its own job_id; job names are not required to be unique, so you would end up with 3 jobs sharing the same name. Use jobs/reset if you want to update an existing job instead.
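If you need create-or-update semantics, a minimal sketch is to check for an existing job by name first and call jobs/reset when a match exists. The host, token, and job settings below are placeholders, not from this thread:

```python
import requests

HOST = "https://<workspace-url>"                               # placeholder
HEADERS = {"Authorization": "Bearer <personal-access-token>"}  # placeholder
settings = {"name": "my_job"}  # hypothetical job JSON (add tasks, cluster, etc.)

# Jobs API 2.1 lets you filter the job list by exact (case-insensitive) name.
resp = requests.get(f"{HOST}/api/2.1/jobs/list",
                    headers=HEADERS,
                    params={"name": settings["name"]})
resp.raise_for_status()
jobs = resp.json().get("jobs", [])

if jobs:
    # Overwrite the existing job's settings instead of creating a duplicate.
    requests.post(f"{HOST}/api/2.1/jobs/reset", headers=HEADERS,
                  json={"job_id": jobs[0]["job_id"],
                        "new_settings": settings}).raise_for_status()
else:
    requests.post(f"{HOST}/api/2.1/jobs/create", headers=HEADERS,
                  json=settings).raise_for_status()
```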