Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Hello, I am experiencing issues importing the schema file I created from the utils repo. This is the logic we use for all ingestion, and all other schemas live in this repo under utils/schemas. I am unable to access the file I created for a new ingestion pipe...
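For context, a minimal sketch of how such an import is often wired up from a notebook, assuming the repo is checked out under /Workspace/Repos and the schema module sits in a utils/schemas folder (the repo path, module, and variable names below are hypothetical placeholders):

```python
import sys

# Hypothetical repo checkout path; adjust to your user and repo name.
repo_root = "/Workspace/Repos/<your-user>/utils"
if repo_root not in sys.path:
    sys.path.append(repo_root)

# Assumes utils/schemas/orders_schema.py defines a StructType named `schema`.
from schemas.orders_schema import schema

# `spark` is the SparkSession provided by the Databricks notebook runtime.
df = spark.read.schema(schema).json("/mnt/raw/orders/")
```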
@Debayan Mukherjee Hello, thank you for your response. Please let me know if these are the correct commands to access the file from a notebook. I can see the files in the repo folder, but I just noticed this: the file I am trying to access has a size of 0 b...
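One way to confirm what the notebook actually sees, assuming the repo files are visible under /Workspace/Repos (the folder path below is a placeholder), is to list the schema folder and print each file's size; a 0-byte entry here means the content never made it into the repo checkout:

```python
import os

# Hypothetical path to the schemas folder inside the repo; adjust as needed.
schemas_dir = "/Workspace/Repos/<your-user>/utils/schemas"

# Print each file name with its size in bytes.
for name in sorted(os.listdir(schemas_dir)):
    path = os.path.join(schemas_dir, name)
    print(name, os.path.getsize(path))
```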
Please help. Here's an example: I have one .py file and one .ipynb, and the .py file contains the function test, but after adding the new function test1 it doesn't appear in the .ipynb, even after re-running the .py file and reimporting it in the .ipynb. How...
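This is usually Python's module caching rather than anything notebook-specific: once a module has been imported, re-running the .py file does not refresh the copy the notebook process already holds. A small sketch of two common workarounds, assuming the .py file is importable as my_module (the module name is a placeholder; test1 is the function from the post):

```python
import importlib
import my_module  # assumes the edited .py file is importable as my_module

# Option 1: explicitly reload the module after editing the .py file,
# so the notebook picks up newly added functions such as test1.
importlib.reload(my_module)
my_module.test1()

# Option 2: use the notebook autoreload magics so edited modules are
# re-imported automatically before each cell runs:
# %load_ext autoreload
# %autoreload 2
```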
Trying to sync one folder from an external S3 bucket to a folder on a mounted S3 bucket, running some simple code on Databricks to accomplish this. The data is a bunch of CSVs and PSVs. The only problem is that some of the files are giving this error that t...
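A minimal sketch of the copy itself, assuming both locations are reachable from the cluster (the bucket and mount paths below are placeholders). Copying file by file lets a single problematic CSV/PSV surface its own error instead of aborting the whole sync:

```python
# Placeholder source and destination; adjust to your bucket and mount point.
src_dir = "s3a://external-bucket/incoming/"
dst_dir = "/mnt/my-mount/landing/"

# Copy each file individually and report failures without stopping the loop.
for f in dbutils.fs.ls(src_dir):
    try:
        dbutils.fs.cp(f.path, dst_dir + f.name)
    except Exception as e:
        print(f"Failed to copy {f.path}: {e}")
```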
Hi all, so far I have been successfully using the CLI to upload files from my local machine to DBFS at /FileStore/tables. Specifically, I have been using my terminal and the following command: databricks fs cp -r <MyLocalDataset> dbfs:/FileStor...
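If the recursive copy is failing partway, one way to narrow down which file it chokes on is to drive the same databricks fs cp command one file at a time from a small script; the local folder and DBFS target below are placeholders:

```python
import subprocess
from pathlib import Path

local_dir = Path("MyLocalDataset")                    # placeholder local folder
dbfs_dir = "dbfs:/FileStore/tables/MyLocalDataset"    # placeholder DBFS target

# Upload one file at a time so a failure points at a specific file
# instead of aborting the whole recursive copy.
for f in sorted(local_dir.glob("*")):
    if f.is_file():
        subprocess.run(
            ["databricks", "fs", "cp", str(f), f"{dbfs_dir}/{f.name}"],
            check=True,
        )
```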
Hi @Ignacio Castineiras, if Arjun.kr's answer fully addressed your question, would you be happy to mark it as best so that others can quickly find the solution? Please let us know if you are still having this issue.
I am trying to migrate my workload to another workspace (from ST to E2). I am planning to use Databricks sync, but I am still not sure: will it migrate everything, like currents, users, groups, jobs, notebooks, etc., or does it have some limitations which I s...