05-01-2024 01:52 AM - edited 05-01-2024 02:08 AM
I need help migrating from DBFS on Databricks to workspace files. I am new to Databricks and am struggling with the material in the links provided.
My workspace.yml also has DBFS paths hard-coded. Attached is a full deployment with Great Expectations.
This was done by an external vendor.
05-01-2024 02:43 AM
Hi @Ameshj ,
I do not see any attachments or links in this post.
However, I will add a document that I feel should help: https://docs.databricks.com/en/_extras/documents/aws-init-workspace-files.pdf
If you can share the specific error you are facing, we can guide you better.
Thanks!
05-01-2024 09:14 PM
Hello - thank you for your reply.
I have only gone as far as creating a new folder and copying the generated init file over.
Where it says to copy from DBFS, I am unable to, as the DBFS path no longer exists.
Regards
05-01-2024 09:52 PM
05-01-2024 02:45 AM
There's also this KB specific to init script migration - https://kb.databricks.com/clusters/migration-guidance-for-init-scripts-on-dbfs
05-01-2024 09:40 PM
Hi,
I have attached the global init script migration, but it does not work either.
dbutils.fs.put(
    "/databricks/scripts/pyodbc-install.sh",
    """#!/bin/bash
curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
curl https://packages.microsoft.com/config/ubuntu/$(lsb_release -rs)/prod.list > /etc/apt/sources.list.d/mssql-release.list
sudo apt-get update
sudo ACCEPT_EULA=Y apt-get install -y msodbcsql17
# optional: for bcp and sqlcmd
sudo ACCEPT_EULA=Y apt-get install -y mssql-tools
echo 'export PATH="$PATH:/opt/mssql-tools/bin"' >> ~/.bashrc
source ~/.bashrc
# optional: for unixODBC development headers
sudo apt-get install -y unixodbc-dev""",
    True,
)
It says the script exit status is non-zero.
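For reference, the dbutils.fs.put call above still writes to a DBFS path, which is what the migration is moving away from. A minimal sketch of the workspace-files alternative is below; the TARGET path is a placeholder (on Databricks it would be a folder like /Workspace/Users/<you>/init), and the script body is abbreviated to the essential lines:

```shell
#!/bin/bash
# Sketch: write the init script as a plain workspace file instead of to DBFS.
# TARGET is a placeholder default so this runs anywhere; on Databricks,
# point it at a folder under /Workspace instead.
TARGET="${TARGET:-./workspace_init}"
mkdir -p "$TARGET"
cat > "$TARGET/pyodbc-install.sh" <<'EOF'
#!/bin/bash
set -x
sudo apt-get update
sudo ACCEPT_EULA=Y apt-get install -y msodbcsql17
EOF
chmod +x "$TARGET/pyodbc-install.sh"
echo "wrote $TARGET/pyodbc-install.sh"
```

The cluster can then reference that workspace file directly in its init scripts configuration instead of a dbfs:/ path.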
05-05-2024 10:36 AM
Hi @Ameshj ,
Sorry for the delay in the response.
For the all_df screenshot - how are you creating that df? Does it contain Tablename? How is it related to init script migration?
Kindly add set -x after the first line, enable cluster log delivery, and share the logs if possible.
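To illustrate the set -x suggestion with a trivial, self-contained script (the echo line is just a stand-in for the install commands):

```shell
#!/bin/bash
set -x  # print each command to stderr before running it, so the failing step shows up in the cluster logs
echo "installing driver"
```

With tracing on, the logs show each command prefixed with "+" right before it runs, which makes the non-zero exit status easy to pin down.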
Thanks & Regards,
Nandini
05-06-2024 06:37 AM
Another suggestion is to use Lakehouse Federation. It may also be a driver issue; we will know from the logs.
05-07-2024 09:31 PM
05-21-2024 02:18 AM
#!/bin/bash
curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
curl https://packages.microsoft.com/config/ubuntu/$(lsb_release -rs)/prod.list > /etc/apt/sources.list.d/mssql-release.list
sudo apt-get update
sudo ACCEPT_EULA=Y apt-get install -y msodbcsql17
# optional: for bcp and sqlcmd
sudo ACCEPT_EULA=Y apt-get install -y mssql-tools
echo 'export PATH="$PATH:/opt/mssql-tools/bin"' >> ~/.bashrc
source ~/.bashrc
# optional: for unixODBC development headers
sudo apt-get install -y unixodbc-dev
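As a quick sanity check after the script runs (assuming unixODBC's odbcinst tool ended up on the PATH), you can list the registered drivers and look for "ODBC Driver 17 for SQL Server":

```shell
#!/bin/bash
# List ODBC drivers registered with unixODBC; guard for machines without it.
if command -v odbcinst >/dev/null 2>&1; then
  odbcinst -q -d
else
  echo "odbcinst not installed"
fi
```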
08-06-2024 09:50 PM
Wow, you are a magician at this.
Thank you for your guidance and assistance. It finally worked.
a week ago
Hi @Ameshj , can you please "accept as solution" what worked for you.
a week ago
Glad it worked and helped you.