N_M
Contributor
since 10-10-2023
10-07-2024

User Stats

  • 13 Posts
  • 2 Solutions
  • 1 Kudos given
  • 2 Kudos received

User Activity

Hello Community, I'm using the for_each task in workflows, but I'm struggling to access the job information through the job APIs. In short, using the runs API (Get a single job run | Jobs API | REST API reference | Databricks on AWS), I'm able to ac...
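A minimal sketch of the kind of call involved, using the documented GET /api/2.1/jobs/runs/get endpoint; the host, token and run_id below are placeholders, and the exact fields exposed for for_each iterations may differ:

import requests

DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                                   # placeholder
RUN_ID = 123456789                                                  # placeholder run_id of the job run

# Fetch the run; the response lists the individual tasks of the run,
# each with its task_key and result state.
resp = requests.get(
    f"{DATABRICKS_HOST}/api/2.1/jobs/runs/get",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"run_id": RUN_ID},
)
resp.raise_for_status()
run = resp.json()

for task in run.get("tasks", []):
    print(task.get("task_key"), task.get("state", {}).get("result_state"))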
Hi Community, I did some research, but I wasn't lucky, and I'm a bit surprised I can't find anything about it. So, I would simply like to access the job parameters when using Python scripts (not notebooks). My flow doesn't use notebooks, but I still need to dri...
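For a Python script task, one common pattern is to pass job parameters through the task's argument list (for example via the {{job.parameters.<name>}} dynamic value reference) and read them as plain command-line arguments; a hedged sketch with hypothetical parameter names:

import argparse

# The task would be configured with arguments such as:
#   ["--run_date", "{{job.parameters.run_date}}", "--source_path", "{{job.parameters.source_path}}"]
parser = argparse.ArgumentParser()
parser.add_argument("--run_date", required=True)     # hypothetical job parameter
parser.add_argument("--source_path", required=True)  # hypothetical job parameter
args = parser.parse_args()

print(f"Running for {args.run_date}, reading from {args.source_path}")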
Hi, I'm using the COPY INTO command to insert new data (in the form of CSVs) into an already existing table. The SQL query takes care of the conversion of the fields to the target table schema (well, there isn't another way to do that), and schema update is n...
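A hedged sketch of that pattern, wrapping COPY INTO in spark.sql with an inline SELECT that casts the CSV columns; the catalog, table, path and column names are placeholders:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    COPY INTO my_catalog.my_schema.target_table
    FROM (
        SELECT
            CAST(id AS BIGINT)                 AS id,
            CAST(amount AS DECIMAL(18, 2))     AS amount,
            to_date(event_date, 'yyyy-MM-dd')  AS event_date
        FROM 's3://my-bucket/incoming/'
    )
    FILEFORMAT = CSV
    FORMAT_OPTIONS ('header' = 'true')
""")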
Dear Community, I'm using the COPY INTO command to automate the staging of files that I receive in an S3 bucket into specific Delta tables (with some transformation on the fly). The command works smoothly, and files are indeed inserted only once (writing i...
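A hedged sketch of such a staged load: COPY INTO skips files it has already ingested by default, and (per the documentation) COPY_OPTIONS ('force' = 'true') can be added to re-ingest them. Table and bucket names are placeholders:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

result = spark.sql("""
    COPY INTO my_catalog.my_schema.staging_table
    FROM 's3://my-bucket/landing/'
    FILEFORMAT = CSV
    FORMAT_OPTIONS ('header' = 'true', 'inferSchema' = 'true')
""")

# The command returns basic load metrics (e.g. rows inserted) that can be logged.
result.show()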
Hi all, due to file size and file transfer limitations, we are receiving huge files compressed and split, in the format FILE.z01, FILE.z02, ..., FILE.zip. However, I can't find a way to unzip multipart files using Databricks. I already tried some of the ...
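One possible workaround, sketched below: copy the parts to the driver's local disk and merge them into a single archive with the Info-ZIP option zip -s 0 ... --out before extracting. This assumes all parts sit in one local directory and that the zip/unzip CLIs are available on the driver (they may need to be installed first); paths are placeholders:

import subprocess

local_dir = "/tmp/multipart"             # parts copied here first, e.g. with dbutils.fs.cp
archive = f"{local_dir}/FILE.zip"        # the final .zip part of the split set
merged = f"{local_dir}/FILE_single.zip"  # merged, non-split archive

# Rewrite the split archive (FILE.z01, FILE.z02, ..., FILE.zip) as one zip file,
# then extract it locally.
subprocess.run(["zip", "-s", "0", archive, "--out", merged], check=True)
subprocess.run(["unzip", "-o", merged, "-d", f"{local_dir}/extracted"], check=True)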