1. Re: SOS - Cannot insert blob data more than 2k for Oracle 8i
gaetanzoritchak Mar 6, 2002 10:30 AM (in response to ivatury)Try the OCI driver. I had the same kind of problems with the thin driver.
-
2. Re: SOS - Cannot insert blob data more than 2k for Oracle 8i
ivatury Mar 6, 2002 10:44 AM (in response to ivatury)I know it can be done safely using the OCI driver, but I'm developing a product that will support multiple databases, and my initial customer has Oracle 8i on Linux, so I have no choice but to use a JDBC-compliant driver. I have looked at other drivers, but they cannot be bundled with my product because my product costs considerably less than their driver distribution cost :(
From a stand-alone program it works fine; we have done the same using Enhydra, and I was able to upload up to 100 MB into the BLOB. Through EJB it isn't happening.
regards
Bhushan -
3. Re: SOS - Cannot insert blob data more than 2k for Oracle 8i
droman Mar 6, 2002 11:54 AM (in response to ivatury)Hi,
I also had this problem once (not only in an EJB context). The cause was a bug in Oracle's JDBC driver: the pieces of data written in a single call have to be smaller than or equal to the chunk size of the BLOB. So we developed this helper class to work around the problem:
import java.io.*;
import java.sql.*;
import java.util.*;
import oracle.sql.*;
import oracle.jdbc.driver.*;
/**
* The BlobInputStream serves as a (temporary) replacement for the binary
* input stream that can be acquired through the "getBinaryInputStream"
* method. This is done to speed up BLOB data traffic; the regular stream
* seems to be implemented inefficiently. The class extends
* java.io.InputStream.
*
* @see java.io.InputStream
* @version 1.0
*/
public class BlobInputStream extends InputStream
{
private BLOB blob = null;
private long blob_pos;
private long blob_size;
private long blob_avail;
private byte[] buffer = null;
private int buf_pos;
private int buf_size;
private int ch;
private int refcnt = 0;
private int read_bytes = 0;
public BlobInputStream ( Blob theBlob ) throws SQLException, IOException
{
// register the given blob to the private variable
this.blob = (BLOB)theBlob;
// get the size of the blob
blob_size = blob.length();
// init amount of blob data available so far
blob_avail = blob_size;
// init blob read position
blob_pos = 1;
// initialize buffer size
// Bug in JDBC Driver!!! Buffer size must be equal to
// chunk size
buf_size = blob.getChunkSize();
// allocate buffer
buffer = new byte[buf_size];
// initialize position variable
buf_pos = 0;
// fill the buffer for the first time
refill();
}
/*
* (re)fill the buffer with blob content
*
*/
void refill() throws IOException
{
refcnt++;
try
{
// if there is blob data available
if ( blob_avail > 0 )
{
// if it is enough to fit the whole buffer
if ( blob_avail > buf_size )
{
// give new content to buffer
blob.getBytes(blob_pos,buf_size,buffer);
// update the blob position to next read position
blob_pos += buf_size;
// decrease the available amount of bytes to read
blob_avail -= buf_size;
}
else // there is not enough to fill the whole buffer
{
// stuff remaining blob content in buffer
blob.getBytes(blob_pos,(int)blob_avail,buffer);
blob_pos += blob_avail - 1;
if ( blob_pos != blob_size )
{
throw new IOException();
}
// shrink the readable window so read() stops at the last
// valid byte instead of returning stale buffer contents
buf_size = (int)blob_avail;
// no more blob data available
blob_avail = 0;
}
// reset the buffer position
buf_pos = 0;
}
else
{
throw new IOException();
}
}
catch ( SQLException e )
{
e.printStackTrace();
throw new IOException();
}
catch ( ArrayIndexOutOfBoundsException iobe )
{
iobe.printStackTrace();
throw new IOException();
}
}
/**
* Override of the InputStream's read method. This method reads a single
* byte from the stream and returns it to the caller.
*
* @return The next byte of data as an int in the range 0 to 255, or -1
* if the end of the input stream has been reached.
* @exception java.io.IOException
*/
public int read() throws IOException
{
if ( blob == null )
{
throw new IOException();
}
if ( buf_pos < buf_size )
{
ch = (int)buffer[buf_pos++] & 0xff;
}
else // at end of read buffer
{
// are there refills left?
if ( blob_avail > 0 )
{
refill();
ch = (int)buffer[buf_pos++] & 0xff;
}
else // end of buffer and no more refills
{
ch = -1;
}
}
return ch;
}
/**
* Override of the InputStream's close method. It nulls all of its private
* variables so the GC can reclaim the used buffer space.
*/
public void close()
{
blob = null;
blob_size = 0;
blob_pos = 0;
blob_avail = 0;
buffer = null;
buf_size = 0;
buf_pos = 0;
}
/**
* This method is not (yet) overridden. It just calls the super class version.
* Future versions will have an implementation to further speed up the
* streamer.
*
* @param n the number of positions to skip in the stream
* @see java.io.InputStream
* @exception java.io.IOException
*/
public long skip ( long n ) throws IOException
{
return super.skip(n);
}
/**
* This method is not (yet) overridden. It just calls the super class version.
* Future versions will have an implementation to further speed up the
* streamer.
*
* @param b The destination buffer to read data into
* @see java.io.InputStream
* @exception java.io.IOException
*/
public int read ( byte b[], int off, int len ) throws IOException
{
return super.read(b,off,len);
}
/**
* This method is not (yet) overridden. It just calls the super class version.
* Future versions will have an implementation to further speed up the
* streamer.
*
* @see java.io.InputStream
* @exception java.io.IOException
*/
public int read ( byte b[]) throws IOException
{
return super.read(b);
}
}
The usage of the class is really simple:
a. Obtain your BLOB.
b. Create BlobOutputStream for this BLOB.
c. Write whatever you want to the stream.
d. Close the stream, commit, close statement & connection.
I hope it will work for you. Be careful: it works only with Oracle 8.1.6 - 8.1.7; I've never tested this stuff with 9i.
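The chunk-size constraint described above boils down to never handing the driver more than getChunkSize() bytes in a single write. As a driver-independent illustration (not part of the original patch), the copy loop might look like this, with a plain OutputStream standing in for the BLOB's getBinaryOutputStream() and chunkSize standing in for blob.getChunkSize():

```java
import java.io.IOException;
import java.io.OutputStream;

class ChunkedBlobWriter {
    // Write data to out in pieces no larger than chunkSize, mirroring
    // the workaround for the Oracle 8i driver bug described above.
    static void writeChunked(OutputStream out, byte[] data, int chunkSize)
            throws IOException {
        int pos = 0;
        while (pos < data.length) {
            // never pass the driver more than chunkSize bytes at once
            int len = Math.min(chunkSize, data.length - pos);
            out.write(data, pos, len);
            pos += len;
        }
        out.flush();
    }
}
```

With the real driver you would obtain the stream from the locked BLOB and pass blob.getChunkSize() as the chunk size.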
Dmitri. -
4. Re: SOS - Cannot insert blob data more than 2k for Oracle 8i
dsundstrom Mar 6, 2002 12:24 PM (in response to ivatury)> The usage of the class is really simple:
> a. Obtain your BLOB.
> b. Create BlobOutputStream for this BLOB.
> c. Write whatever your want in the stream.
> d. Close the stream, commit, close statement & connection
I have seen other patches like this, but the problem I have implementing it is step a. Where do I get the blob object?
I am given a byte array and told to store it in parameter k of a prepared statement. Where do I get the initial blob object?
BTW, there is another thread on this somewhere in the forums, and they suggest using setBlob instead of the input stream stuff. That solution creates an implementation of Blob, ByteArrayBlob, to pass to setBlob. -
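The ByteArrayBlob idea mentioned above is straightforward to sketch. Below is a minimal read-only implementation of java.sql.Blob wrapping a byte array; this is an illustration, not the actual patch class, and it implements the modern interface (methods such as free() did not exist on the 2002-era java.sql.Blob):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.Blob;
import java.sql.SQLException;

// A minimal read-only java.sql.Blob backed by a byte array,
// enough to pass to PreparedStatement.setBlob() on insert.
class ByteArrayBlob implements Blob {
    private final byte[] data;

    ByteArrayBlob(byte[] data) {
        this.data = data;
    }

    public long length() {
        return data.length;
    }

    // JDBC positions are 1-based
    public byte[] getBytes(long pos, int length) {
        byte[] out = new byte[length];
        System.arraycopy(data, (int) pos - 1, out, 0, length);
        return out;
    }

    public InputStream getBinaryStream() {
        return new ByteArrayInputStream(data);
    }

    public InputStream getBinaryStream(long pos, long length) {
        return new ByteArrayInputStream(data, (int) pos - 1, (int) length);
    }

    // The remaining Blob methods are not needed for a plain insert.
    public long position(byte[] pattern, long start) throws SQLException {
        throw new SQLException("read-only");
    }
    public long position(Blob pattern, long start) throws SQLException {
        throw new SQLException("read-only");
    }
    public int setBytes(long pos, byte[] bytes) throws SQLException {
        throw new SQLException("read-only");
    }
    public int setBytes(long pos, byte[] bytes, int off, int len) throws SQLException {
        throw new SQLException("read-only");
    }
    public OutputStream setBinaryStream(long pos) throws SQLException {
        throw new SQLException("read-only");
    }
    public void truncate(long len) throws SQLException {
        throw new SQLException("read-only");
    }
    public void free() {
        // nothing to release
    }
}
```

With a class like this, the byte array from step a above can be stored via statement.setBlob(k, new ByteArrayBlob(bytes)) instead of the stream API.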
5. Re: SOS - Cannot insert blob data more than 2k for Oracle 8i
ivatury Mar 7, 2002 4:02 AM (in response to ivatury)I got it working today when I received this piece of code after posting on experts-exchange.com (by Tim Yates). Now I'm able to upload files of up to 100 MB into the BLOB without any problem.
I'm posting the code he sent to me; comments are invited on it.
Bhushan
Here is the code
--------------------------------------------------------#Start of Tim's code-------------------------------------
This is the ejbStore method out of our working JBoss project, so I hope it helps:
---------------------------------
public void ejbStore()
{
if( !dirty ) return ;
super.ejbStore();
Connection connection = null;
PreparedStatement statement = null;
try
{
connection = dataSource.getConnection();
connection.setAutoCommit( false );
statement = connection.prepareStatement("UPDATE DISCUSSION SET XML = empty_clob() WHERE DISCUSSIONID = ?");
statement.setString(1, discussionid);
if (statement.executeUpdate() < 1)
{
throw new NoSuchEntityException("Row does not exist");
}
statement.close();
statement = null;
// Ok, now we have added the row, we can set the clob data....
// 1: lock the row...
statement = connection.prepareStatement( "SELECT XML FROM DISCUSSION WHERE DISCUSSIONID = ? FOR UPDATE" ) ;
statement.setString( 1, discussionid );
ResultSet rslt = statement.executeQuery() ;
if( rslt.next() )
{
CLOB cl = DSName.getOracleResultSet( rslt ).getCLOB( 1 ) ;
Writer wr = cl.getCharacterOutputStream() ;
wr.write( xml.toString() );
wr.close();
}
rslt.close();
rslt = null ;
statement.close() ;
statement = null ;
// autoCommit was disabled above, so commit explicitly before closing
connection.commit();
connection.close();
connection = null;
dirty = false ;
}
catch( IOException e )
{
throw new EJBException( "Arse... stream error... data was not updated... oh well... : " + e.toString() );
}
catch(SQLException e)
{
throw new EJBException("Error executing SQL UPDATE DISCUSSION SET XML = ? WHERE DISCUSSIONID = ?: " + e.toString());
}
finally
{
try
{
if (statement != null)
{
statement.close();
}
}
catch(SQLException e)
{
}
try
{
if (connection != null)
{
connection.close();
}
}
catch(SQLException e)
{
}
}
}
----------------------------------
As you can see, the important bit is that you set the clob column to empty_clob() first, then lock it
for updating, then write to the handle this returns...
...the "dirty" flag is just something we added to stop entity beans storing when they had not changed, as this slowed things down a bit... But as it happens, we could probably have just changed our container options -- well, I guess we were learning as we went along too (and still are) :-)
BAH!! Just noticed as well, that this bit:
if( rslt.next() )
{
CLOB cl = DSName.getOracleResultSet( rslt ).getCLOB( 1 ) ;
Writer wr = cl.getCharacterOutputStream() ;
wr.write( xml.toString() );
wr.close();
}
Calls this static method (inside my DSName class)
public static OracleResultSet getOracleResultSet( ResultSet r )
{
ResultSet rslt = ((org.opentools.minerva.jdbc.ResultSetInPool)r).getUnderlyingResultSet() ; // JBoss 2.2
return ((OracleResultSet)rslt) ;
}
--------------------------------------------------------#End of Tim's code------------------------------------- -
6. Re: SOS - Cannot insert blob data more than 2k for Oracle 8i
dsundstrom Mar 7, 2002 10:18 AM (in response to ivatury)Oracle is the biggest piece of crap.
This solution is completely unreasonable. There is a patch on SourceForge for BLOBs. Can you try to adapt that solution for CLOBs? The patch is much easier to implement and easier to understand. -
7. Re: SOS - Cannot insert blob data more than 2k for Oracle 8i
ivatury Mar 8, 2002 1:11 AM (in response to ivatury)Could you please let me know where it is on SourceForge? I searched all through it and couldn't find it. I know the solution is unreasonable, but it's a work-around for my problem, as my product's alpha release date is 11th March.
-
9. Re: SOS - Cannot insert blob data more than 2k for Oracle 8i
ivatury Mar 11, 2002 3:55 AM (in response to ivatury)Got it, will try applying this patch - since we had the work-around, we have some time to implement it.
Thanks for the link.