sql - Transposing and then exporting a table to a CSV file
I have a SQL table with 3 columns, like this:
I am writing a script in Amazon Redshift (which is based on PostgreSQL 8.0.2, I believe) that exports the above table to a CSV file, transposed. By transposed I mean creating a new column for each cobrand (there are 4 distinct values in the cobrand_id column) in the CSV file. To illustrate what I want, I have included an image (the values are illustrative!):
When I try:
copy temp_08.jwn_calc to 'p:/sql_new/products_199.csv' delimiter ',' csv header;
I get the error: [42601] ERROR: syntax error at or near "header" Position: 74
When I remove "csv header", I get the error: [0A000] ERROR: COPY TO file from Xen-tables not supported
Any help is appreciated!
Transposing
To transpose the data, you'll have to write a query that names each column, such as:
select qqyy "quarter",
       sum(case when cobrand_id = 10001372 then sum end) "10001372",
       sum(case when cobrand_id = 10005244 then sum end) "10005244",
       sum(case when cobrand_id = 10005640 then sum end) "10005640",
       sum(case when cobrand_id = 10006164 then sum end) "10006164"
from input_table
group by qqyy
order by qqyy
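If you are not sure which cobrand_id values need their own column, a quick check such as the following (a minimal sketch, assuming the same illustrative table name input_table) will list them:

-- List the distinct cobrand values so each one can be given a column
select distinct cobrand_id
from input_table
order by cobrand_id;

There is no automatic pivot here, so each value returned needs its own sum(case ...) expression written out by hand.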
Saving
The COPY command in Amazon Redshift can load data from:
- Amazon S3
- Amazon DynamoDB
- An Amazon EMR cluster
- A Linux host running SSH
If you wish to load data into Redshift, you should place the CSV (or zipped CSV) into an Amazon S3 bucket and use the COPY command to import the data.
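For example, a load from S3 might look roughly like this (a sketch only: the bucket path and the credentials string are placeholders you would replace with your own):

-- Load a CSV sitting in S3 into the Redshift table
copy temp_08.jwn_calc
from 's3://my-bucket/products_199.csv'
credentials '...'
delimiter ','
ignoreheader 1
csv;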
If you wish to export data from Redshift, use the UNLOAD command, which creates zipped CSV files in Amazon S3. It is not possible to directly download results from Redshift via the UNLOAD command. Alternatively, a SQL client running locally on your computer might have the ability to save query results to a file.
The error you received is due to the fact that you attempted to access the filesystem of the Redshift host computer (p:/sql_new/products_199.csv). This is not permitted, since you have no login access to the host computer.
If you have a SQL query that transforms the data the way you want, use the UNLOAD command to export it:
unload ('select...from...')
to 's3://my-bucket/output.csv'
credentials ...
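Putting the two parts together, an export of the transposed result could look roughly like this (again a sketch: the S3 prefix is a placeholder, and PARALLEL OFF is only there so the output lands in a single file rather than one file per slice):

-- Unload the transposed query straight to S3
unload ('select qqyy "quarter",
                sum(case when cobrand_id = 10001372 then sum end) "10001372",
                sum(case when cobrand_id = 10005244 then sum end) "10005244",
                sum(case when cobrand_id = 10005640 then sum end) "10005640",
                sum(case when cobrand_id = 10006164 then sum end) "10006164"
         from input_table
         group by qqyy
         order by qqyy')
to 's3://my-bucket/products_199_'
credentials '...'
delimiter ','
allowoverwrite
parallel off;

The resulting file(s) appear under that S3 prefix and can then be downloaded with any S3 client.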