Hi Folks,
Since Databricks now recommends dbx instead of databricks-connect, we are trying to set up our local environment following this guide:
dbx by Databricks Labs - Azure Databricks | Microsoft Learn
I have created conf/deployment.yml and dbx/project.json and added them to the root folder of my repo.
The deployment file looks like this:

```yaml
build:
  python: "pip"
environments:
  default:
    workflows:
      - name: "dbx-xxxx-job"
        spark_python_task:
          python_file: "C:\framework\library\py\platformdata\tests\test_dbutils.py"
```
I am getting the error below while running `dbx execute workflowname --cluster-name=""`. Any suggestions?
```
C:\Python\lib\site-packages\yaml\scanner.py:1149 in scan_flow_scalar

  1146   start_mark = self.get_mark()
  1147   quote = self.peek()
  1148   self.forward()
❱ 1149   chunks.extend(self.scan_flow_scalar_non_spaces(double, start_mark))
  1150   while self.peek() != quote:
  1151       chunks.extend(self.scan_flow_scalar_spaces(double, start_mark))
  1152       chunks.extend(self.scan_flow_scalar_non_spaces(double, start_mark))

C:\Python\lib\site-packages\yaml\scanner.py:1223 in scan_flow_scalar_non_spaces

  1220               self.scan_line_break()
  1221               chunks.extend(self.scan_flow_scalar_breaks(double, start_mark))
  1222           else:
❱ 1223               raise ScannerError("while scanning a double-quoted scalar", start_ma
  1224                       "found unknown escape character %r" % ch, self.get_mark())
  1225           else:
  1226               return chunks
```
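In case it helps narrow things down: the traceback ends in PyYAML's double-quoted-scalar scanner, and in a double-quoted YAML string a backslash starts an escape sequence, so a Windows path like `C:\framework\library\...` hits an unrecognized escape. A variant of the task entry that should parse cleanly (this is just my guess at the cause, not confirmed; the path is my local test file) would use forward slashes:

```yaml
# Possible fix (assumption, not verified): forward slashes avoid YAML
# escape processing in double-quoted scalars. Single quotes
# ('C:\framework\...') or doubled backslashes ("C:\\framework\\...")
# would be alternatives.
spark_python_task:
  python_file: "C:/framework/library/py/platformdata/tests/test_dbutils.py"
```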