- It's nothing but the "Bulk Loader Utility" (SQL*Loader).
- With this utility we can load data into a table in bulk.
- Main word is LOAD.
- Then the question comes to mind: what is the difference between load and insert? Insert happens one row at a time; load happens in one go.
- What data? Which table? Loading script? Execute --> these are the four things YOU NEED TO KEEP IN MIND.
- Flat files --> csv (comma separated values), txt, dat, excel, etc.
- Always use Notepad to prepare the data file.
select employee_id || ',' || first_name || ',' || salary from employees_table where rownum <= 10; --> this will fetch only 10 rows.
- Save the output in a folder as a .csv file.
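- For reference, the saved csv should contain one comma separated row per line, like this (illustrative values, not actual query output):
100,Steven,24000
101,Neena,17000
102,Lex,17000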
select employee_id || ',' || first_name || ',' || salary from employees_table where employee_id between 150 and 170; --> this will fetch the rows whose employee_id is between 150 and 170.
- Save this output as a .txt file.
Now coming to table creation
create table sample(id number, name varchar2(25), salary number);
Now coming to the creation of the script
- The loading script and the control file are the same thing.
load data infile 'path_of_the_file.csv'
infile 'path_of_the_file.txt'
insert into table sample
fields terminated by ','
(id,name,salary)
- Create the script in Notepad, choose "All Files" as the save type, and save it with a .ctl extension.
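- For clarity, here is the same control file again with comments (SQL*Loader treats lines starting with -- as comments; the paths are placeholders):
-- data files to read from
load data infile 'path_of_the_file.csv'
infile 'path_of_the_file.txt'
-- load method and target table (insert expects an empty table)
insert into table sample
-- how fields are separated inside the file
fields terminated by ','
-- map each field, in order, to a table column
(id,name,salary)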
Now coming to Execute
sqlldr hr_schema_name/password control='file_location_of_control_file_or_execution_file' direct=true
- Why direct=true here? --> it loads very fast because it bypasses all constraints and triggers (direct path load).
- If direct=false --> constraints and triggers are checked and then the rows are inserted (conventional path load).
- In short, direct=true is fast but skips validation, while direct=false is slower but safe.
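- After the load finishes, you can do a quick sanity check from SQL*Plus (the count depends on your data):
select count(*) from sample; --> should match the number of rows in your flat files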
Excluding one column
- If some column should not be loaded, then use FILLER.
load data infile 'path_of_the_file.csv'
infile 'path_of_the_file.txt'
insert into table sample
fields terminated by ','
(id,name,salary filler)
load data infile 'path_of_the_file.csv'
infile 'path_of_the_file.txt'
insert into table sample
fields terminated by ','
(id,name filler,salary)
- In the above examples, the salary column (first script) and the name column (second script) will be left empty; the values marked FILLER are read from the file but not loaded.
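- You can confirm this with a quick query after the load (values depend on your data):
select id, name, salary from sample; --> the column marked FILLER shows up as NULL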
Condition
- WHEN --> the loaded data must obey the condition you give. Rows that fail the condition are stored in the DISCARD FILE.
- If there is an Oracle error (for example a datatype or constraint violation), the rejected rows get captured in the BAD FILE.
- The WHEN condition goes here:
load data infile 'path_of_the_file.csv'
infile 'path_of_the_file.txt'
insert into table sample when <condition>
fields terminated by ','
(id,name filler,salary)
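- For example, a sketch assuming we only want rows whose id field is not '0' (SQL*Loader compares a field or position to a literal with = or != in the WHEN clause):
load data infile 'path_of_the_file.csv'
insert into table sample when (id != '0')
fields terminated by ','
(id,name,salary)
- Rows failing this condition go to the discard file.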
How to get the process summary?
- It will be stored in the LOG FILE.
- You can set all the files in the command itself, like below.
sqlldr hr_schema_name/password control='file_location_of_control_file_or_execution_file' log=summary.log bad=sample.bad discard=sample.dsc direct=true
- Whatever file names you give here are created automatically during the load; if you omit the log name, it defaults to the control file name with a .log extension.
- So the important takeaway here is: the log file holds the load summary, the bad file holds rows rejected by Oracle errors, and the discard file holds rows that failed the WHEN condition.
skip
- If you want to skip rows while loading (for example a header line), you can specify it in the command itself.
sqlldr hr_schema_name/password control='file_location_of_control_file_or_execution_file' skip=2 direct=true
- The first 2 rows of the data file will be skipped.
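- A common use is a data file whose first line is a header row (hypothetical file content), loaded with skip=1:
id,name,salary
100,Steven,24000
101,Neena,17000
sqlldr hr_schema_name/password control='file_location_of_control_file_or_execution_file' skip=1 direct=true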
Notes
- The short keyword for SQL*Loader is sqlldr.
- insert into table sample --> this works only when the table is EMPTY. If you try to execute it again, SQL*Loader throws an error saying the table must be empty for the INSERT option.
- So you can use APPEND instead:
load data infile 'path_of_the_file.csv'
infile 'path_of_the_file.txt'
append into table sample
fields terminated by ','
(id,name,salary)
- You can also use TRUNCATE (it deletes the old data and then loads the new data):
load data infile 'path_of_the_file.csv'
infile 'path_of_the_file.txt'
truncate into table sample
fields terminated by ','
(id,name,salary)
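- A quick recap of the three load methods seen above:
insert into table ... --> works only when the table is empty
append into table ... --> keeps existing rows and adds the new ones
truncate into table ... --> deletes existing rows, then loads the new ones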
Task
- For a particular file, (#) is used as the separator instead of (,) - how do you load it?
- How do you load an Excel file?