One table has an NVARCHAR2 column holding 63 characters; another table has an NCLOB column holding 829 characters.

How much storage do those 63 and 829 characters take up?

  • Look at the character set of the database (or of the columns): the encoding determines the number of bits per character. The widely used AL16UTF16, for example, is 16 bits, i.e. 2 bytes per character - Mike
  • What do you mean by memory — disk space? - Nikola Tesla
  • The encoding is AL32UTF8. Yes, how much disk space does the given number of characters take? - LocalUser

1 answer

Columns of type NVARCHAR2 and NCLOB can only use the AL16UTF16 or UTF8 national character sets. The default is AL16UTF16, set at the database level. To find out which one is in use:

 select * from nls_database_parameters where PARAMETER='NLS_NCHAR_CHARACTERSET' 

One character in the AL16UTF16 encoding takes up to 4 bytes. Accordingly, your values occupy at most 63 × 4 and 829 × 4 bytes; for text confined to the Basic Multilingual Plane each character takes 2 bytes, giving 63 × 2 = 126 and 829 × 2 = 1658 bytes. The exact figure depends on which characters are stored.

How many bytes a value in an NVARCHAR2 column (not NCLOB) occupies can be found with the LENGTHB(column) function.
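For example, assuming a hypothetical table t with an NVARCHAR2 column name (both identifiers are placeholders), the character and byte lengths can be compared side by side:

    select length(name)  as char_count,
           lengthb(name) as byte_count
      from t;

For BMP-only text in AL16UTF16 you would expect byte_count = 2 × char_count.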

For NCLOB it is harder. You can try (I may be wrong) reading the size in bytes 4000 characters at a time: LENGTHB(dbms_lob.substr(<NCLOB column>, 4000, 1)) — note that in DBMS_LOB.SUBSTR the amount comes before the offset. What bothers me here is the character-set conversion from the national character set to the database character set; I would not rely much on such a conversion.
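That chunked approach could be sketched as a PL/SQL block (the table t and column doc are hypothetical placeholders; as the warning above says, the implicit character-set conversion inside DBMS_LOB.SUBSTR makes the result approximate at best):

    declare
      l_clob   nclob;
      l_chunk  nvarchar2(4000);
      l_offset pls_integer := 1;  -- offset is measured in characters
      l_bytes  pls_integer := 0;
    begin
      select doc into l_clob from t where rownum = 1;
      loop
        l_chunk := dbms_lob.substr(l_clob, 4000, l_offset);
        exit when l_chunk is null;
        l_bytes  := l_bytes + lengthb(l_chunk);
        l_offset := l_offset + length(l_chunk);
      end loop;
      dbms_output.put_line('approx. bytes: ' || l_bytes);
    end;
    /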

You can also try converting the NCLOB to a BLOB (using dbms_lob.converttoblob ) and then calling dbms_lob.getlength .
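A sketch of that route (again with the hypothetical t/doc names; DBMS_LOB.CONVERTTOBLOB needs a temporary destination BLOB and in/out offsets). The target character-set id is an assumption here: passing nls_charset_id('AL16UTF16') measures the size in the national character set, whereas dbms_lob.default_csid would convert to the database character set instead:

    declare
      l_clob     nclob;
      l_blob     blob;
      l_dest_off integer := 1;
      l_src_off  integer := 1;
      l_lang_ctx integer := dbms_lob.default_lang_ctx;
      l_warning  integer;
    begin
      select doc into l_clob from t where rownum = 1;
      dbms_lob.createtemporary(l_blob, true);
      dbms_lob.converttoblob(l_blob, l_clob, dbms_lob.lobmaxsize,
                             l_dest_off, l_src_off,
                             nls_charset_id('AL16UTF16'),
                             l_lang_ctx, l_warning);
      dbms_output.put_line('bytes: ' || dbms_lob.getlength(l_blob));
      dbms_lob.freetemporary(l_blob);
    end;
    /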

Or, to be completely accurate, you can find the number of the data file and of the block (using dbms_rowid.rowid_relative_fno(ROWID) and dbms_rowid.rowid_block_number(ROWID) ), make a dump ( alter system dump datafile <datafile number> block <block number> ) and inspect the resulting trace file in the USER_DUMP_DEST directory. This method works for any type.
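Put together (the table t is again a placeholder), the block-dump route looks like this:

    select dbms_rowid.rowid_relative_fno(rowid) as file_no,
           dbms_rowid.rowid_block_number(rowid) as block_no
      from t
     where rownum = 1;

    -- substitute the values returned above:
    alter system dump datafile <file_no> block <block_no>;

The dump then lands in the session's trace file under USER_DUMP_DEST.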