sql - Ignore errors in a batch insert in PostgreSQL


I have a process that runs every 5 minutes and tries to insert a batch of articles into an articles table. The articles come from web scraping, so there are cases where the batch being inserted contains articles that have already been saved in the db.

My primary key is a uuid, the md5 hash of the article title.

Checking whether each article already exists in the db in order to filter the batch is kind of inefficient.

Is there a db-level way in PostgreSQL to ignore attempts to insert a duplicate uuid without returning an error?
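For reference, here is a minimal sketch of the setup described above; the table and column names (articles, id, title, body) are assumptions for illustration, not the actual schema:

-- hypothetical articles table: the uuid primary key is the md5 hash of the title
create table articles (
    id    uuid primary key,   -- e.g. md5(title)::uuid
    title text not null,
    body  text
);

-- the periodic batch insert, which currently errors out on a duplicate id
insert into articles (id, title, body)
values (md5('Some scraped title')::uuid, 'Some scraped title', '...');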

solution

You can insert using a WHERE NOT EXISTS clause.

For example, consider a test table with a numeric id primary key and a textual name.

code

db=> create table test(id bigserial primary key, name text);
CREATE TABLE

-- insertion works - empty table
db=> insert into test(id, name)
     select 1, 'partner number 1'
     where not exists (select 1,2 from test where id=1);
INSERT 0 1

-- insertion does not work - duplicate id
db=> insert into test(id, name)
     select 1, 'partner number 1'
     where not exists (select 1,2 from test where id=1);
INSERT 0 0

-- after 2 insertions, the table contains 1 row
db=> select * from test;
 id |       name
----+------------------
  1 | partner number 1
(1 row)
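Applied to the batch scenario from the question, the same pattern can filter an incoming VALUES list against the rows already stored. A sketch under the assumed articles schema above:

-- insert only the rows whose id is not already present
insert into articles (id, title, body)
select v.id, v.title, v.body
from (values
        (md5('Title A')::uuid, 'Title A', '...'),
        (md5('Title B')::uuid, 'Title B', '...')
     ) as v(id, title, body)
where not exists (
    select 1 from articles a where a.id = v.id
);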

Difference from ON CONFLICT

Quoting the documentation:

ON CONFLICT can be used to specify an alternative action to raising a unique constraint or exclusion constraint violation error.

The action can be either DO NOTHING or DO UPDATE; the second approach is referred to as upsert, a portmanteau of insert and update.

Technically, WHERE NOT EXISTS is not exactly equivalent to ON CONFLICT DO NOTHING; see the query plans for a deeper dive.
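For completeness, a sketch of the ON CONFLICT alternative against the same assumed articles table (available since PostgreSQL 9.5):

-- do nothing: silently skip rows whose id already exists
insert into articles (id, title, body)
values (md5('Title A')::uuid, 'Title A', '...')
on conflict (id) do nothing;

-- do update (upsert): overwrite the stored row with the incoming values
insert into articles (id, title, body)
values (md5('Title A')::uuid, 'Title A', '...')
on conflict (id) do update
set title = excluded.title,
    body  = excluded.body;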

