Yesterday I was asked how to read and write BLOBs in the database.
With Java:
Read from an input stream:
InputStream myBlobInputStream = connection.createStatement().executeQuery("select myBlob from t").getBlob(1).getBinaryStream();
Write to an output stream:
OutputStream myBlobStream = ((oracle.sql.BLOB) connection.createStatement().executeQuery("select myBlob from t for update").getBlob(1)).getBinaryOutputStream();
where connection is your java.sql.Connection.
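Put together, here is a minimal runnable sketch. The one-liners above skip the ResultSet.next() call; the class and method names are made up for illustration, and Blob.setBinaryStream is the plain JDBC alternative to Oracle's getBinaryOutputStream:

import java.io.InputStream;
import java.io.OutputStream;
import java.sql.Blob;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class BlobDemo {
    // read the BLOB of the first row of T into a byte array
    static byte[] readBlob(Connection connection) throws Exception {
        try (Statement stmt = connection.createStatement();
             ResultSet rs = stmt.executeQuery("select myBlob from t")) {
            rs.next();
            try (InputStream in = rs.getBlob(1).getBinaryStream()) {
                return in.readAllBytes();
            }
        }
    }

    // overwrite the BLOB of the first row of T; the FOR UPDATE lock needs autocommit off
    static void writeBlob(Connection connection, byte[] data) throws Exception {
        connection.setAutoCommit(false);
        try (Statement stmt = connection.createStatement();
             ResultSet rs = stmt.executeQuery("select myBlob from t for update")) {
            rs.next();
            Blob blob = rs.getBlob(1);
            try (OutputStream out = blob.setBinaryStream(1L)) {
                out.write(data);
            }
        }
        connection.commit();
    }
}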
You could also use PL/SQL and the DBMS_LOB API. There is a complete book (306 pages!) in the documentation about working with large objects: Application Developer’s Guide – Large Objects.
Sometimes, you can use plain SQL.
SQL> create table t(x BLOB);

Table created

SQL> insert into t values(utl_raw.cast_from_number(1));

1 row inserted

SQL> select utl_raw.cast_to_number(x) from t;

UTL_RAW.CAST_TO_NUMBER(X)
-------------------------
                        1
A smart move may be to use SQL*Loader. You can specify one file per row:
LOAD DATA
INFILE '/tmp/x.txt'
INTO TABLE "T"
(name filler char(255),
 x lobfile(name) terminated by EOF)
and your import file /tmp/x.txt will look like
x.gif
y.gif
but you could also load a 10,000-character-long field from your input file into a CLOB column, just by specifying VARCHARC(4,10000) as the datatype.
Dude, start using the PRE tag for your code. I looked at this post and thought I’d gone dyslexic. 🙂
Cheers
Tim…
Thanks… always looking for code like this.
OK, I now use <PRE> instead of <CODE>, I hope you like it.
Laurent,
Also, a temporary BLOB can be created when using Java:
BLOB myBlob = BLOB.createTemporary(connection, false, BLOB.DURATION_SESSION);
This can come in handy to avoid having to create a record with an empty BLOB first.
Regards,
Remco
Thanks Remco for the update
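For reference, here is a minimal sketch of Remco's suggestion, assuming the Oracle JDBC driver (oracle.sql.BLOB) and a table t(x BLOB); the class and method names are made up, and setBytes/setBlob are the standard java.sql.Blob and PreparedStatement calls:

import java.sql.Connection;
import java.sql.PreparedStatement;
import oracle.sql.BLOB;

public class TempBlobDemo {
    // fill a session-duration temporary BLOB and bind it directly to the INSERT,
    // so no row with an empty BLOB has to be created and locked first
    static void insertWithTempBlob(Connection connection, byte[] data) throws Exception {
        BLOB myBlob = BLOB.createTemporary(connection, false, BLOB.DURATION_SESSION);
        try {
            myBlob.setBytes(1L, data);
            try (PreparedStatement ps =
                     connection.prepareStatement("insert into t(x) values(?)")) {
                ps.setBlob(1, myBlob);
                ps.executeUpdate();
            }
        } finally {
            BLOB.freeTemporary(myBlob);  // release the temporary LOB
        }
    }
}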
How about using PL/SQL to load multimedia LOBs delivered from the web? I have an application that does this, and it works just fine, but it requires a commit after EACH file loaded (because of the SELECT…FOR UPDATE). Fine for a single-user environment – sometimes OK for a 5-user environment – but once I have 100 users uploading files at the same time, the COMMITs go through the roof, and obviously the frequent commits are not best practice.
I’ve tried SQL*Loader but it’s not consistent enough in such a dynamic environment. PL/SQL is more consistently accurate, but slow as death.
Any suggestions? Or is this a “c’est la vie” situation?