This section describes how the Oracle JDBC drivers handle Java streams for several datatypes. Data streams allow you to read LONG column data of up to 2 gigabytes. Methods associated with streams let you read the data incrementally.
Oracle JDBC drivers support the manipulation of data streams in either direction between server and client. The drivers support all stream conversions: binary, ASCII, and Unicode. Following is a brief description of each type of stream:

- Binary: used for RAW bytes of data. This corresponds to the getBinaryStream() method.
- ASCII: used for ASCII bytes in ISO-Latin-1 encoding. This corresponds to the getAsciiStream() method.
- Unicode: used for Unicode bytes with the UCS-2 encoding. This corresponds to the getUnicodeStream() method.
The methods getBinaryStream(), getAsciiStream(), and getUnicodeStream() return the bytes of data in an InputStream object. These methods are described in greater detail in Chapter 4, "Oracle Extensions".
When a query selects one or more LONG or LONG RAW columns, the JDBC driver transfers these columns to the client in streaming mode. After a call to executeQuery() or next(), the data of the LONG column is waiting to be read.
To access the data in a LONG column, you can get the column as a Java InputStream and use the read() method of the InputStream object. As an alternative, you can get the data as a string or byte array, in which case the driver will do the streaming for you.
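For example, a minimal sketch of the second approach might look like the following; the table name longexample and the column TEXTCOL are assumptions for illustration only:

// Minimal sketch (assumed table longexample with a LONG column TEXTCOL):
// the driver streams the LONG data internally and returns it as a String.
ResultSet rset = stmt.executeQuery("select TEXTCOL from longexample");
if (rset.next())
{
   String contents = rset.getString(1);   // driver does the streaming for you
   System.out.println(contents.length() + " characters read");
}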
You can get LONG and LONG RAW data with any of the three stream types. The driver performs NLS conversions for you depending on the character set of your database and the driver. For more information about NLS, see "Using NLS".
A call to getBinaryStream() returns RAW data "as-is". A call to getAsciiStream() converts the RAW data to hexadecimal and returns the ASCII representation. A call to getUnicodeStream() converts the RAW data to hexadecimal and returns the Unicode bytes.
For example, if your LONG RAW column contains the bytes 20 21 22, you receive the following bytes:

LONG RAW | BinaryStream | ASCIIStream | UnicodeStream |
---|---|---|---|
20 21 22 | 20 21 22 | '1' '4' '1' '5' '1' '6' | 0 '1' 0 '4' 0 '1' 0 '5' 0 '1' 0 '6' |
For example, the LONG RAW value 20 is represented in hexadecimal as 14, or "1" "4". In ASCII, "1" is represented by the byte 49 and "4" by the byte 52. In Unicode, a padding of zeros is used to separate individual values. So, the hexadecimal value 14 is represented as 0 "1" 0 "4", and the corresponding Unicode byte representation is 0 49 0 52.
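The following sketch illustrates the difference, using the streamexample table that is created later in this chapter (a LONG RAW column named GIFDATA). Each query reads the same column through a different stream type:

// Minimal sketch, reading the GIFDATA LONG RAW column of the streamexample
// table first as a binary stream, then as an ASCII (hexadecimal) stream.
ResultSet rset = stmt.executeQuery
   ("select GIFDATA from streamexample where NAME='LESLIE'");
if (rset.next())
{
   InputStream raw = rset.getBinaryStream(1);   // the raw bytes, "as-is"
   int b;
   while ((b = raw.read()) != -1)
      System.out.print(b + " ");
}

rset = stmt.executeQuery
   ("select GIFDATA from streamexample where NAME='LESLIE'");
if (rset.next())
{
   InputStream hex = rset.getAsciiStream(1);    // hexadecimal digits as ASCII characters
   int c;
   while ((c = hex.read()) != -1)
      System.out.print((char)c);
}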
When you get LONG data with getAsciiStream(), the drivers assume that the underlying data in the database uses a US7ASCII or WE8ISO8859P1 character set. If the assumption is true, the drivers return bytes corresponding to ASCII characters. If the database is not using a US7ASCII or WE8ISO8859P1 character set, a call to getAsciiStream() returns gibberish.
When you get LONG data with getUnicodeStream(), you get a stream of Unicode characters in the UCS-2 encoding. This applies to all underlying database character sets that Oracle supports.
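As a sketch of how a program might consume that stream, the following converts the UCS-2 bytes into a Java string. The column and table names match the later examples in this chapter; the assumption that each character arrives as two bytes, high byte first, is for illustration only:

// Minimal sketch: convert the UCS-2 stream of a LONG column into a String.
// Assumes each character arrives as two bytes, high byte first.
ResultSet rset = stmt.executeQuery("select LONGCOL from TABLE");
if (rset.next())
{
   InputStream unicode = rset.getUnicodeStream(1);
   StringBuffer buf = new StringBuffer();
   int hi, lo;
   while ((hi = unicode.read()) != -1 && (lo = unicode.read()) != -1)
      buf.append((char)((hi << 8) | lo));
   String text = buf.toString();
}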
When you get LONG data with getBinaryStream(), there are two possible cases:

- If you are using the OCI driver and the client character set is not US7ASCII or WE8ISO8859P1, then a call to getBinaryStream() returns UTF-8. If the client character set is US7ASCII or WE8ISO8859P1, then the call returns a US7ASCII stream of bytes.
- If you are using the Thin driver and the server-side character set is not US7ASCII or WE8ISO8859P1, then a call to getBinaryStream() returns UTF-8. If the server-side character set is US7ASCII or WE8ISO8859P1, then the call returns a US7ASCII stream of bytes.
For more information on how the drivers return data based on character set, see "Using NLS".
Table 3-3 summarizes LONG and LONG RAW data conversions for each stream type.
One of the features of a getXXXStream() method is that it allows you to fetch data incrementally. In contrast, getBytes() fetches all of the data in one call. This section contains two examples of getting a stream of binary data. The first version uses the getBinaryStream() method to obtain LONG RAW data; the second version uses the getBytes() method.
This Java example writes the contents of a LONG RAW column to a file on the local file system. In this case, the driver fetches the data incrementally.

The following code creates the table that stores a column of LONG RAW data associated with the name LESLIE:
-- SQL code:
create table streamexample (NAME varchar2 (256), GIFDATA long raw);
insert into streamexample values ('LESLIE', '00010203040506070809');
The following Java code snippet writes the data from the LESLIE LONG RAW column into a file called leslie.gif:
ResultSet rset = stmt.executeQuery
   ("select GIFDATA from streamexample where NAME='LESLIE'");
// get first row
if (rset.next())
{
   // Get the GIF data as a stream from Oracle to the client
   InputStream gif_data = rset.getBinaryStream(1);
   FileOutputStream file = null;
   try
   {
      file = new FileOutputStream("leslie.gif");
      int chunk;
      while ((chunk = gif_data.read()) != -1)
         file.write(chunk);
   }
   catch (Exception e)
   {
      String err = e.toString();
      System.out.println(err);
   }
   finally
   {
      if (file != null)
         file.close();
   }
}
In this example, the contents of the GIFDATA column are transferred incrementally in chunk-sized pieces between the database and the client. The InputStream object returned by the call to getBinaryStream() reads the data directly from the database connection.
This version of the example gets the contents of the GIFDATA column with getBytes() instead of getBinaryStream(). In this case, the driver fetches all of the data in one call and stores it in a byte array. The previous code snippet can be rewritten as:
ResultSet rset2 = stmt.executeQuery
   ("select GIFDATA from streamexample where NAME='LESLIE'");
// get first row
if (rset2.next())
{
   // Get the GIF data as a byte array from Oracle to the client
   byte[] bytes = rset2.getBytes(1);
   FileOutputStream file = null;
   try
   {
      file = new FileOutputStream("leslie2.gif");
      file.write(bytes);
   }
   catch (Exception e)
   {
      String err = e.toString();
      System.out.println(err);
   }
   finally
   {
      if (file != null)
         file.close();
   }
}
Because a LONG RAW column can contain up to 2 gigabytes of data, the getBytes() example will probably use much more memory than the getBinaryStream() example. Use streams if you do not know the maximum size of the data in your LONG or LONG RAW columns.
The JDBC driver automatically streams any LONG and LONG RAW columns. However, there may be situations where you want to avoid data streaming. For example, if you have a very small LONG column, you might want to avoid returning the data incrementally and, instead, return the data in one call.
To avoid streaming, use the defineColumnType() method to redefine the type of the LONG column. For example, if you redefine the LONG or LONG RAW column as type VARCHAR or VARBINARY, then the driver will not automatically stream the data.
If you redefine column types with defineColumnType(), you must declare the types of all columns in the query. If you do not, executeQuery() will fail. In addition, you must cast the Statement object to the type oracle.jdbc.driver.OracleStatement.
As an added benefit, using defineColumnType() saves the driver two round trips to the database when executing the query. Without defineColumnType(), the JDBC driver has to request the datatypes of the columns.
Using the example from the previous section, the Statement object stmt is cast to an OracleStatement and the column containing LONG RAW data is redefined to be of the type VARBINARY. The data is not streamed; instead, it is returned in a byte array.
// Cast the statement stmt to an OracleStatement
oracle.jdbc.driver.OracleStatement ostmt =
   (oracle.jdbc.driver.OracleStatement)stmt;

// Redefine the LONG column at index position 1 to VARBINARY
ostmt.defineColumnType(1, Types.VARBINARY);

// Do a query to get the images named 'LESLIE'
ResultSet rset = ostmt.executeQuery
   ("select GIFDATA from streamexample where NAME='LESLIE'");

// The data is not streamed here
rset.next();
byte[] bytes = rset.getBytes(1);
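Because defineColumnType() requires the types of all columns in the query to be declared, a query that also selects the NAME column must define both columns. The following is a minimal sketch of that case; the choice of VARCHAR for the NAME column is an assumption for illustration:

// Sketch: every column in the select list is defined, not just the LONG RAW column
oracle.jdbc.driver.OracleStatement ostmt =
   (oracle.jdbc.driver.OracleStatement)stmt;
ostmt.defineColumnType(1, Types.VARCHAR);     // NAME
ostmt.defineColumnType(2, Types.VARBINARY);   // GIFDATA, no longer streamed
ResultSet rset = ostmt.executeQuery
   ("select NAME, GIFDATA from streamexample where NAME='LESLIE'");
if (rset.next())
{
   String name  = rset.getString(1);
   byte[] bytes = rset.getBytes(2);
}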
If you use the defineColumnType() Oracle extension to redefine a CHAR, VARCHAR, or RAW column as a LONGVARCHAR or LONGVARBINARY, then you can get the column as a stream. The program will behave as if the column were actually of type LONG or LONG RAW. Note that there is not much point to this, because these columns are usually short.
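For example, a minimal sketch that redefines the NAME column of the streamexample table as LONGVARCHAR and then reads it as an ASCII stream might look like this:

// Sketch: redefine a VARCHAR2 column as LONGVARCHAR so it can be read as a stream
oracle.jdbc.driver.OracleStatement ostmt =
   (oracle.jdbc.driver.OracleStatement)stmt;
ostmt.defineColumnType(1, Types.LONGVARCHAR);
ResultSet rset = ostmt.executeQuery
   ("select NAME from streamexample where NAME='LESLIE'");
if (rset.next())
{
   InputStream name_stream = rset.getAsciiStream(1);
   int c;
   while ((c = name_stream.read()) != -1)
      System.out.print((char)c);
}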
If you try to get a CHAR, VARCHAR, or RAW column as a data stream without redefining the column type, the JDBC driver will return a Java InputStream, but no real streaming occurs. In the case of these datatypes, the JDBC driver fully fetches the data into an in-memory buffer during a call to executeQuery() or next(). The getXXXStream() entry points return a stream that reads data from this buffer.
If your query selects multiple columns and one of the columns contains a data stream, then the contents of the columns following the stream column are usually not available until the stream has been read. This is because the database sends each row as a set of bytes representing the columns in the SELECT order: the data after a streaming column can be read only after the stream has been read.
For example, consider the following query:
ResultSet rset = stmt.executeQuery
   ("select DATECOL, LONGCOL, NUMBERCOL from TABLE");
while (rset.next())
{
   // get the date data
   java.sql.Date date = rset.getDate(1);

   // get the streaming data
   InputStream is = rset.getAsciiStream(2);

   // Open a file to store the stream data
   FileOutputStream file = new FileOutputStream("ascii.dat");

   // Loop, reading from the ascii stream and
   // writing to the file
   int chunk;
   while ((chunk = is.read()) != -1)
      file.write(chunk);

   // Close the file
   file.close();

   // get the number column data
   int n = rset.getInt(3);
}
The incoming data for each row has the following shape:
<a date><the characters of the long column><a number>
When you call rset.next(), the JDBC driver stops reading the row data just before the first character of the LONG column. The call to rset.getAsciiStream() then reads the characters of the LONG column directly out of the database connection as a Java stream. The driver reads the NUMBER data from the third column only after it reads the last byte of the data from the stream.
An exception to this behavior is LOB data, which is also transferred between server and client as a Java stream. For more information on how the driver treats LOB data, see "Streaming LOBs and External Files".
There might be situations where you want to avoid reading a column that contains streaming data. If you do not want to read the data for the streaming column, then call the close() method of the stream object. This method discards the stream data and allows the driver to continue reading data for all the non-streaming columns that follow the stream. Even though you are intentionally discarding the stream, it is good programming practice to access the columns in SELECT list order.
In the following example, the stream data in the LONG column is discarded and only the data from the DATE and NUMBER columns is retrieved:
ResultSet rset = stmt.executeQuery
   ("select DATECOL, LONGCOL, NUMBERCOL from TABLE");
while (rset.next())
{
   // get the date
   java.sql.Date date = rset.getDate(1);

   // access the stream data and discard it with close()
   InputStream is = rset.getAsciiStream(2);
   is.close();

   // get the number column data
   int n = rset.getInt(3);
}
This section describes some of the precautions you must take to ensure that you do not accidentally discard or lose your stream data. The drivers automatically discard stream data if you perform any JDBC operation that communicates with the database, other than reading the current stream. Two common precautions are described in the following sections:
To recover the data from a column containing a data stream, it is not enough to get the column; you must read and store its contents. Otherwise, the contents will be discarded when you get the next column.
If your query selects multiple columns, the database sends each row as a set of bytes representing the columns in the SELECT order. If one of the columns contains stream data, the database sends the entire data stream before proceeding to the next column.
If you do not use the SELECT list order to access data, then you can lose the stream data. That is, if you bypass the stream data column and access data in a column that follows it, the stream data will be lost. For example, if you try to access the data for the NUMBER column before reading the data from the stream data column, the JDBC driver first reads and then discards the streaming data automatically. This can be very inefficient if the LONG column contains a large amount of data.
If you try to access the LONG column later in the program, the data will not be available and the driver will return a "Stream Closed" error. This is illustrated in the following example:
ResultSet rset = stmt.executeQuery
   ("select DATECOL, LONGCOL, NUMBERCOL from TABLE");
while (rset.next())
{
   int n = rset.getInt(3);                    // This discards the streaming data
   InputStream is = rset.getAsciiStream(2);   // Raises an error: stream closed
}
If you get the stream but do not use it before you get the NUMBER column, the stream still closes automatically:
ResultSet rset = stmt.executeQuery
   ("select DATECOL, LONGCOL, NUMBERCOL from TABLE");
InputStream is = null;
while (rset.next())
{
   is = rset.getAsciiStream(2);   // Get the stream
   int n = rset.getInt(3);        // Discards streaming data and closes the stream
}
int c = is.read();                // c is -1: no more characters to read; stream closed
If the JDBC driver encounters a column containing a data stream, row prefetching is set back to 1.
You can discard the data from a stream at any time by calling the stream's close() method. You can also close and discard the stream by closing its result set or connection object. You can find more information about the close() method for data streams in "Bypassing Streaming Data Columns". For information on how to avoid closing a stream and discarding its data by accident, see "Streaming Data Precautions".
The term large object (LOB) refers to a data item that is too large to be stored directly in a database table. Instead, a locator is stored in the database table and points to the location of the actual data. The JDBC drivers provide support for three types of LOBs: BLOBs (unstructured binary data), CLOBs (single-byte character data), and BFILEs (external files). The Oracle JDBC drivers support the streaming of CLOB, BLOB, and BFILE data.
LOBs behave differently from the other types of streaming data described in this chapter. The driver transfers LOB data between server and client as a Java stream. However, unlike most Java streams, a locator representing the LOB data is stored in the table. Thus, you can access the LOB data at any time during the life of the connection.
When a query selects one or more CLOB or BLOB columns, the JDBC driver transfers to the client the data pointed to by the locator. The driver performs the transfer as a Java stream. To manipulate CLOB or BLOB data from JDBC, use methods in the Oracle extension classes oracle.sql.BLOB and oracle.sql.CLOB. These classes provide functionality such as reading from the CLOB or BLOB into an input stream, writing from an output stream into a CLOB or BLOB, determining the length of a CLOB or BLOB, and closing a CLOB or BLOB.
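For instance, a minimal sketch of reading BLOB data as a stream might look like the following; the table and column names (my_blob_table, BLOBCOL) are assumptions for illustration, and the result set is cast to OracleResultSet to use the getBLOB() extension:

// Sketch (assumed table my_blob_table with a BLOB column BLOBCOL):
// read the LOB data pointed to by the locator as a binary stream.
ResultSet rset = stmt.executeQuery("select BLOBCOL from my_blob_table");
if (rset.next())
{
   oracle.sql.BLOB blob =
      ((oracle.jdbc.driver.OracleResultSet)rset).getBLOB(1);
   InputStream in = blob.getBinaryStream();
   int chunk;
   while ((chunk = in.read()) != -1)
   {
      // process each byte of the BLOB data
   }
   in.close();
}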
For a complete discussion of how to use streaming CLOB and BLOB data, see "Reading and Writing BLOB and CLOB Data".
An external file, or BFILE, is used to store a locator to a file that is outside the database, stored somewhere on the file system of the data server. The locator points to the actual location of the file.
When a query selects one or more BFILE columns, the JDBC driver transfers to the client the file pointed to by the locator. The transfer is performed in a Java stream. To manipulate BFILE data from JDBC, use methods in the Oracle extension class oracle.sql.BFILE. This class provides functionality such as reading from the BFILE into an input stream, determining the length of a BFILE, and closing a BFILE.
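As with BLOBs, a minimal sketch of reading BFILE data might look like this; the table and column names (my_bfile_table, FILECOL) are assumptions for illustration, and the external file must be opened before its contents can be streamed:

// Sketch (assumed table my_bfile_table with a BFILE column FILECOL):
// open the external file, stream its contents, then close it.
ResultSet rset = stmt.executeQuery("select FILECOL from my_bfile_table");
if (rset.next())
{
   oracle.sql.BFILE bfile =
      ((oracle.jdbc.driver.OracleResultSet)rset).getBFILE(1);
   bfile.openFile();
   InputStream in = bfile.getBinaryStream();
   int chunk;
   while ((chunk = in.read()) != -1)
   {
      // process each byte of the external file
   }
   in.close();
   bfile.closeFile();
}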
For a complete discussion of how to use streaming BFILE data, see "Reading BFILE Data".