Tuesday, April 04, 2006

DB2 Process Architecture


"Architect your passion to achieve..."

The following figure shows a general overview of the architecture and processes for DB2 UDB.



Local clients communicate using

  • shared memory
  • semaphores


Remote clients use a protocol such as

  • Named Pipes (NPIPE)
  • TCP/IP
  • NetBIOS
  • SNA

The DB2 agents perform most of the SQL processing on behalf of applications. A set of subagents might be assigned to process the client application requests. All agents and subagents are managed using a pooling algorithm that minimizes the creation and destruction of EDUs (engine dispatchable units), such as agents.

Buffer pools are areas of database server memory where database pages of user table data, index data, and catalog data are temporarily moved and can be modified. Buffer pools are a key determinant of database performance because data can be accessed much faster from memory than from disk. If more of the data needed by applications is present in a buffer pool, less time is required to access the data than to find it on disk.

The configuration of the buffer pools, as well as prefetcher and page cleaner EDUs, controls how quickly data can be accessed and how readily available it is to applications.

  • Prefetchers retrieve data from disk and move it into the buffer pool before applications need the data. For example, applications needing to scan through large volumes of data would have to wait for data to be moved from disk into the buffer pool if there were no data prefetchers. Agents of the application send asynchronous read-ahead requests to a common prefetch queue. As prefetchers become available, they implement those requests by using big-block or scatter-read input operations to bring the requested pages from disk to the buffer pool. If you have multiple disks for storage of the database data, the data can be striped across the disks. Striping data lets the prefetchers use multiple disks at the same time to retrieve data.
  • Page cleaners move data from the buffer pool back out to disk. Page cleaners are background EDUs that are independent of the application agents. They look for pages from the buffer pool that are no longer needed and write the pages to disk. Page cleaners ensure that there is room in the buffer pool for the pages being retrieved by the prefetchers.

Without the independent prefetchers and the page cleaner EDUs, the application agents would have to do all of the reading and writing of data between the buffer pool and disk storage.
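
As a rough sketch of how these pieces are configured (the database name SAMPLE, the buffer pool size, and the EDU counts below are only illustrative), buffer pools are created with an SQL statement, while the number of prefetchers and page cleaners is controlled by the NUM_IOSERVERS and NUM_IOCLEANERS database configuration parameters:

db2 connect to sample
db2 create bufferpool bp8k immediate size 10000 pagesize 8 k
db2 update db cfg for sample using num_ioservers 4 num_iocleaners 2
db2 terminate

The new NUM_IOSERVERS and NUM_IOCLEANERS values typically take effect the next time the database is activated.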

Friday, April 22, 2005

  • Configuring DB2

"Configure your mind to do extra ordinary things in simple ways"

DB2 and its environment can be configured at different levels:

1) DB2 Database Manager configuration
2) Database configuration
3) DB2 Client configuration
4) DB2 Federated Database configuration



Database manager configuration parameters are stored in a file named db2systm. Database configuration parameters are stored in a file named SQLDBCON. These files cannot be directly edited, and can only be changed or viewed via a supplied API or by a tool which calls that API.
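
In practice both levels are usually viewed and changed through the command line processor, which calls that API. A minimal sketch, assuming a database named SAMPLE and purely illustrative parameter values:

db2 get dbm cfg
db2 update dbm cfg using svcename db2c_db2inst1

db2 get db cfg for sample
db2 update db cfg for sample using logfilsiz 2048

Some parameters can be changed online, while others take effect only after the instance is restarted (db2stop/db2start) or the database is reactivated.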



Sunday, April 17, 2005

  • DB2Look

    "LOOKing forward with hope is the first step in the Journey of Life"

    DB2 is a "DB2 Statistics and DDL Extraction Tool". Extracts the required DDL statements to reproduce the database objects of a production database on a test database. This tool can also generate the required UPDATE statements used to replicate the statistics on the objects in a test database, as well as the update database configuration and update database manager configuration parameters and the db2set statements so that the registry variables and configuration parameter settings on the test database match those of the production database.

    Authorization
    SELECT privilege on the system catalogs.

    Sample

    db2look -d department -u Prabhu -t staff -e -o db2look.sql

    The above example generates the DDL statements for table STAFF created by user Prabhu in database DEPARTMENT. The db2look output is sent to the file db2look.sql.


    Important Parameters for DB2Look Command

    -d DBname : Database name / alias name of the database from which the statistics and DDL are to be generated.

    -u Creator : Creator ID. Limits output to objects with this creator id.

    -a All Objects : When this option is specified the output is not limited to the objects created under a particular creator ID.

    -h Display help information: When this option is specified, all other options are ignored, and only the help information is displayed.

    -r No RUNSTATS : When this option is specified in conjunction with the -m option, db2look does not generate the RUNSTATS command. The default action is to generate the RUNSTATS command.

    -c No COMMIT Statements : When this option is specified in conjunction with the -m option, db2look does not generate COMMIT, CONNECT and CONNECT RESET statements. The default action is to generate these statements. The -c option is ignored if the -m option is not specified.

    -t Tname : Table name. Limits the output to a particular table.

    -o OutputFileName : If this option is not specified, output is written to standard output.

    -e Extract DDL statements : Extract DDL statements for database objects.

    -m : Generate the required UPDATE statements to replicate the statistics on tables, columns and indexes. The -p, -g, and -s options are ignored when the -m option is specified.


    -l : If this option is specified, then the db2look utility will generate DDL for user defined table spaces, nodegroups and buffer pools. DDL for the following database objects is extracted when using the -l option:
  • User defined table spaces
  • User defined nodegroups
  • User defined buffer pools

    -x Generate Security Options : If this option is specified, the db2look utility will generate authorization DDL (GRANT statement, for example).

    -i userid : Use this option when working with a remote database.

    -w password : Used with the -i option. The user ID and the password are used by db2look to log on to the remote system.

    -f Config Parameters : Use this option to extract configuration parameters and registry variables. Note: Only configuration parameters and registry variables that affect the DB2 query optimizer are extracted.


    Example

    db2look -d department -u walid -e -m -o db2look.sql

    Generates both the DDL statements for the objects created by user walid and the UPDATE statements to replicate the statistics on the tables and indexes created by the same user. The db2look output is sent to the file db2look.sql.
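
    Combining several of the options listed above, a fuller extraction might look like the following sketch (the output file name is arbitrary; the database name comes from the earlier examples):

    db2look -d department -a -e -l -x -f -o full_ddl.sql

    This captures the DDL for all creators (-a), including table spaces, nodegroups and buffer pools (-l), the GRANT statements (-x), and the optimizer-related configuration parameters and registry variables (-f).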


    Reference : http://publib.boulder.ibm.com/infocenter/db2luw/v8/index.jsp

Wednesday, April 13, 2005

DB2 Import

"Exporting your worries is the passport to IMPORT happiness"

Note: Read the DB2 Export post below first.

Inserts data from an external file with a supported file format into a table, hierarchy, or view. A faster alternative is LOAD; however, the load utility does not support loading data at the hierarchy level.

Moreover, the LOAD utility does not enforce referential constraints and other relationships while loading the data, and hence the integrity of the table may be compromised. The import utility uses INSERT statements, so even though import is slower than LOAD, the integrity of the tables is preserved.

db2 Import from myfile.ixf of ixf
Messages msg.txt
Insert into staff

The above example shows how to import information from myfile.ixf to the STAFF table.

Some of the parameters for Import Utility



  • COMMITCOUNT n
    Performs a COMMIT after every n records are imported. It is not applicable when importing through DB2 Connect.

  • CREATE
    Creates the table definition and row contents. Applicable only during import of IXF file.

  • IN tablespace-name
    Identifies the table space in which the table will be created if the CREATE or REPLACE_CREATE parameter is used.

  • INDEX IN tablespace-name
    Identifies the table space in which any indexes on the table will be created. This option is allowed only when the primary table space specified in the IN clause is a DMS table space. The specified table space must exist, and must be a REGULAR DMS table space.

  • INSERT
    Adds the imported data to the table without changing the existing table data.

  • INSERT_UPDATE
    Adds rows of imported data to the target table, or updates existing rows (of the target table) with matching primary keys.

  • INTO table-name
    Specifies the database table into which the data is to be imported.

  • LOBS FROM lob-path
    Specifies one or more paths that store LOB files.

  • LONG IN tablespace-name
    Identifies the table space in which the values of any long columns (LONG VARCHAR, LONG VARGRAPHIC, LOB data types, or distinct types with any of these as source types) will be stored.

  • MESSAGES message-file
    Specifies the destination for warning and error messages that occur during an import operation.

  • METHOD
    L : (ASC files only) Specifies the start and end column numbers from which to import data. A column number is a byte offset from the beginning of a row of data, numbered starting from 1.
    N : (IXF files only) Specifies the names of the columns to be imported.
    P : (IXF and DEL files) Specifies the indexes (numbered from 1) of the input data fields to be imported.

  • NULL INDICATORS null-indicator-list
    This option can only be used when the METHOD L parameter is specified; that is, when the input file is an ASC file.

  • OF filetype
    Specifies the format of the input file: ASC, DEL, WSF, or IXF (a file to which data was previously exported).

  • REPLACE
    Deletes all existing data from the table by truncating the data object, and inserts the imported data.

  • REPLACE_CREATE
    If the table exists, deletes all existing data from the table by truncating the data object, and inserts the imported data without changing the table definition or the index definitions.

  • RESTARTCOUNT n
    Specifies that an import operation is to be started at record n + 1. The first n records are skipped.
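
Putting a few of these parameters together, a typical invocation might look like the following sketch (the file name staffdata.del is hypothetical, and INSERT_UPDATE assumes the STAFF table has a primary key):

db2 Import from staffdata.del of del
Commitcount 1000
Messages msg.txt
Insert_Update into staff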

Consider the following scenario and how import works.
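
For reference, the two target tables can be assumed to be defined roughly as follows (the column names come from the commands below; the exact data types are an assumption). TABLE1 uses a GENERATED BY DEFAULT identity column, TABLE2 a GENERATED ALWAYS one:

db2 Create table table1 (c1 varchar(30), c2 int generated by default as identity, c3 decimal(7,2), c4 char(1))

db2 Create table table2 (c1 varchar(30), c2 int generated always as identity, c3 decimal(7,2), c4 char(1))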

Data records in DATAFILE1 (DEL format):

"Liszt"
"Hummel",,187.43, H
"Grieg",100, 66.34, G
"Satie",101, 818.23, I

Data records in DATAFILE2 (DEL format):

"Liszt", 74.49, A
"Hummel", 0.01, H
"Grieg", 66.34, G
"Satie", 818.23, I

The following command generates identity values for rows 1 and 2, since no identity values are supplied in DATAFILE1 for those rows. Rows 3 and 4, however, are assigned the user-supplied identity values of 100 and 101, respectively.

db2 Import from datafile1.del of del
Replace into table1

To import DATAFILE1 into TABLE1 so that identity values are generated for all rows, issue one of the following commands:

db2 Import from datafile1.del of del
Method P(1, 3, 4)
Replace into table1 (c1, c3, c4)

db2 Import from datafile1.del of del
Modified by identityignore
Replace into table1


To import DATAFILE2 into TABLE1 so that identity values are generated for each row, issue one of the following commands:

db2 Import from datafile2.del of del
Replace into table1 (c1, c3, c4)

db2 Import from datafile2.del of del
Modified by identitymissing
Replace into table1


If DATAFILE1 is imported into TABLE2 without using any of the identity-related file type modifiers, rows 1 and 2 will be inserted, but rows 3 and 4 will be rejected, because they supply their own non-NULL values, and the identity column is GENERATED ALWAYS.

Notes on Import

  • REPLACE, REPLACE_CREATE : A COMMIT is performed automatically when the log becomes full.
  • INSERT, INSERT_UPDATE : A rollback is performed when the log becomes full.
  • Referential constraints and foreign key definitions are not preserved when creating tables from PC/IXF files. (Primary key definitions are preserved if the data was previously exported using SELECT *.)
  • Data cannot be imported to a system table, a declared temporary table, or a summary table.
  • Views cannot be created through the import utility.

Some Modifiers that can be used in Import with all File formats:

  • compound=x : Uses nonatomic compound SQL to insert the data, and x statements will be attempted each time.
  • generatedignore : This modifier informs the import utility that data for all generated columns is present in the data file but should be ignored.
  • generatedmissing : If this modifier is specified, the utility assumes that the input data file contains no data for the generated columns (not even NULLs), and will therefore generate a value for each row.
  • identityignore : This modifier informs the import utility that data for the identity column is present in the data file but should be ignored.
  • identitymissing : If this modifier is specified, the utility assumes that the input data file contains no data for the identity column (not even NULLs), and will therefore generate a value for each row.
  • lobsinfile : lob-path specifies the path to the files containing LOB values.
  • nodefaults : If a source column for a target table column is not explicitly specified, and the table column is not nullable, default values are not loaded.
  • usedefaults : If a source column for a target table column has been specified, but it contains no data for one or more row instances, default values are loaded.
  • dateformat="x" : x is the format of the date in the source file. "D-M-YYYY" , "MM.DD.YYYY" , "YYYYDDD"
  • implieddecimal : The location of an implied decimal point is determined by the column definition; it is no longer assumed to be at the end of the value. For example, the value 12345 is loaded into a DECIMAL(8,2) column as 123.45, not 12345.00.
  • timeformat="x" : x is the format of the time in the source file. "HH:MM:SS" , "HH.MM TT" , "SSSSS"
  • timestampformat="x" : "YYYY/MM/DD HH:MM:SS.UUUUUU" ... In all the above date/time formats TT indicates AM/PM and UUUUUU represents microseconds since midnight.
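
As a small sketch of how these modifiers are combined (the file and table names are hypothetical), a delimited file whose dates are stored day-first could be imported like this:

db2 Import from mydata.del of del
Modified by dateformat="DD-MM-YYYY" usedefaults
Messages msg.txt
Insert into mytable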

Modifiers that can be used in Import with the ASC (non-delimited ASCII) file format alone

  • nochecklengths : If nochecklengths is specified, an attempt is made to import each row, even if the source data has a column definition that exceeds the size of the target table column.
  • nullindchar=x :x is a single character. Changes the character denoting a null value to x. The default value of x is Y.
  • reclen=x : x is an integer with a maximum value of 32 767. x characters are read for each row, and a new-line character is not used to indicate the end of the row.
  • striptblanks / striptnulls : Truncates trailing blank spaces (striptblanks) or trailing NULLs, that is, 0x00 characters (striptnulls), when loading data into a variable-length field. If these options are not specified, the trailing blanks or NULLs are kept.
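
A sketch of an ASC import using these modifiers together with METHOD L (the file name, column positions, and target table are hypothetical and must match the actual fixed-width layout):

db2 Import from mydata.asc of asc
Modified by reclen=30 striptblanks
Method L (1 10, 11 20, 21 30)
Messages msg.txt
Insert into mytable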

Modifiers that can be used in Import with DEL (Delimited ASCII file) file format alone

  • chardelx : x is a single character string delimiter.
  • coldelx : x is a single character column delimiter.
  • datesiso : Date format. Causes all date data values to be imported in ISO format.
  • decplusblank : Plus sign character. Causes positive decimal values to be prefixed with a blank space instead of a plus sign (+).
  • decptx : x is a single character substitute for the period as a decimal point character. The default value is a period (.).
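
The same delimiter modifiers used in the DB2 Export post below also work on import; a sketch for a file with single-quoted strings, semicolon column delimiters, and comma decimal points (file and table names are hypothetical):

db2 Import from eurodata.del of del
Modified by chardel'' coldel; decpt,
Messages msg.txt
Insert into mytable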

Modifiers that can be used in Import with IXF file format alone

  • forcein : Directs the utility to accept data despite code page mismatches, and to suppress translation between code pages.
  • indexixf : Directs the utility to drop all indexes currently defined on the existing table, and to create new ones from the index definitions in the PC/IXF file.
  • indexschema=schema : Uses the specified schema for the index name during index creation.
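
A sketch using indexixf (the table name is hypothetical); since the indexes defined in the PC/IXF file replace the existing ones, this modifier is normally combined with the REPLACE option:

db2 Import from mytable.ixf of ixf
Modified by indexixf
Messages msg.txt
Replace into mytable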

Reference : http://publib.boulder.ibm.com/infocenter/db2luw/v8/index.jsp

Saturday, April 09, 2005

DB2 Export

"Laughter and Love are mankind's most valuable Exports" - Wikiquote.com

Exports data from a database to one of several external file formats.

Export File Formats
  1. DEL (delimited ASCII format)
  2. WSF (work sheet format), used by Lotus 1-2-3 and Lotus Symphony
  3. IXF (integrated exchange format)
With the IXF format, most of the table attributes, as well as any existing indexes, are saved in the IXF file, except when columns are specified in the SELECT statement. With this format, the table can be recreated, while with the other file formats, the table must already exist before data can be imported into it.
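
For instance, a table can be copied through an IXF file by exporting it and then importing it with the CREATE option described in the DB2 Import post above (the target name STAFF_COPY is just an illustration):

db2 Export to staff.ixf of ixf
Select * from staff

db2 Import from staff.ixf of ixf
Create into staff_copy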

Consider the following Example

db2 Export to myfile.del of del
Modified by chardel'' coldel; decpt,
Select * from staff

In the above example the STAFF table contents are exported to the myfile.del file in the current directory. You can also change the directory by specifying a full path:

db2 Export to c:\DB2Export\myfile.del ...

While exporting the STAFF table into myfile.del, the strings are delimited by single quotes ('), the columns are delimited by semicolons (;), and the decimal point is a comma (,).

Exporting LOB Data

The LOB data can be exported to a different location by using the following syntax

db2 Export to myfile.del of del lobs to mylobs
Lobfile lobs1, lobs2 modified by lobsinfile
Select * from emp_photo

Messages

The information, warning, and error messages generated during the export are sent to standard output by default. Alternatively, they can be sent to a file by using the MESSAGES parameter.

db2 Export to awards.ixf of ixf
Messages msgs.txt
Select * from staff where dept = 20


Security


One of the following authorizations / privileges is required to export data from a table.

  • sysadm / dbadm
  • CONTROL / SELECT privilege on each participating table or view.

Some Facts about EXPORT

  • PC/IXF import should be used to move data between databases.
  • The export utility produces a warning message whenever a character column with a length greater than 254 is selected for export to DEL format files.
  • Data Link columns can also be exported.
  • The default character delimiter is the double quote ("); the alternative is the single quote (').
  • The default column delimiter is the comma (,); the alternative is the semicolon (;).


Reference

http://publib.boulder.ibm.com/infocenter/db2luw/v8/index.jsp