I was importing large JSON objects with many fields, and DuckDB refused to expand each field into its own column, instead giving me a single `STRUCT...` or `MAP(VARCHAR, VARCHAR)` column. While experimenting with the parameters, I stumbled across this list of available arguments:
```
D SELECT * FROM read_json('./test-flat.json', format='array', auto_detectt=true) LIMIT 1;
Binder Error: Invalid named parameter "auto_detectt" for function read_json
Candidates:
map_inference_threshold BIGINT
records VARCHAR
timestampformat VARCHAR
field_appearance_threshold DOUBLE
date_format VARCHAR
dateformat VARCHAR
sample_size BIGINT
columns ANY
convert_strings_to_integers BOOLEAN
format VARCHAR
ignore_errors BOOLEAN
maximum_object_size UINTEGER
maximum_depth BIGINT
auto_detect BOOLEAN
union_by_name BOOLEAN
maximum_sample_files BIGINT
compression VARCHAR
timestamp_format VARCHAR
hive_types ANY
hive_partitioning BOOLEAN
hive_types_autocast BOOLEAN
filename ANY
```
It seems like not all of them are documented. The one I needed in my case was `map_inference_threshold`, which apparently has a default that is too low for my objects; after setting it to `-1`, the columns got generated. Searching for it via GitHub also yields no results.
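For reference, a sketch of the call that worked for me (assuming the same `./test-flat.json` array file; whether `-1` is the documented way to disable the threshold may vary by DuckDB version):

```sql
-- Setting map_inference_threshold to -1 keeps DuckDB from collapsing
-- wide objects into a MAP(VARCHAR, VARCHAR), so each field becomes
-- its own column instead.
SELECT *
FROM read_json('./test-flat.json',
               format = 'array',
               map_inference_threshold = -1)
LIMIT 1;
```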
Page URL: https://duckdb.org/docs/data/json/loading_json.html