Oracle Streams – Section 2
The basic capture process
When data is modified in an Oracle database, the change is recorded in the redo log so that the database has a way to recover when an error occurs. The capture process is an Oracle background process that reads the redo log, which gives it the ability to capture DML and DDL modification operations.
The capture process converts each change into a message in a defined format, called an LCR, and stores it in a queue. Because the capture process captures changes automatically according to its own rules, it is also known as implicit capture.
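As a minimal sketch, implicit capture can be set up with the DBMS_STREAMS_ADM package; adding rules creates the capture process automatically if it does not already exist (the administrator schema, queue, capture, and table names below are illustrative):

```sql
-- Create a queue to hold the captured LCRs (names are illustrative).
BEGIN
  DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_table => 'strmadmin.streams_queue_table',
    queue_name  => 'strmadmin.streams_queue');
END;
/

-- Adding table-level rules implicitly creates a capture process,
-- which is why this style is called implicit capture.
BEGIN
  DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name   => 'hr.employees',
    streams_type => 'capture',
    streams_name => 'capture_emp',
    queue_name   => 'strmadmin.streams_queue',
    include_dml  => TRUE,
    include_ddl  => TRUE);
END;
/
```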
What is an LCR?
The capture process captures database operations such as modifications to a table, a schema, or even the entire database. These changes are recorded in the redo log; the capture process analyzes the redo log, formats each database change, and saves it as a message known as a logical change record (LCR). The capture process uses defined rules to determine which changes are captured; the captured modifications are referred to as captured messages.
There are two types of LCRs: row LCRs and DDL LCRs.
row LCR: contains the modification information generated by a DML operation. Note that a single DML statement may modify multiple rows, so one DML operation may produce multiple row LCRs. In addition, a modification to a single row's large-field columns, such as LONG or LOB, may also produce multiple row LCRs.
Each row LCR is packaged as an object of type LCR$_ROW_RECORD, which includes the following attributes:
- source_database_name: the database where the modification was made
- command_type: the type of command that made the modification, such as INSERT, UPDATE, DELETE, LOB ERASE, LOB WRITE, or LOB TRIM
- object_owner: object owner
- object_name: object name
- tag: a tag that can be used to track the LCR
- transaction_id: the ID of the transaction containing the DML statement
- scn: the SCN (system change number) at the time of the modification
- old_values: the column values before the DML modification; note that different DML operations produce different values: for UPDATE or DELETE these are the values before the change, while for INSERT this column is empty
- new_values: the column values after the DML modification; for the same reason, different DML operations produce different values: for UPDATE or INSERT these are the modified values, while for DELETE this column is empty
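The attributes above are read through member functions of LCR$_ROW_RECORD. As a sketch, a DML handler (the procedure name is illustrative) might print some of them:

```sql
CREATE OR REPLACE PROCEDURE emp_dml_handler(in_any IN ANYDATA) IS
  lcr SYS.LCR$_ROW_RECORD;
  rc  PLS_INTEGER;
BEGIN
  rc := in_any.GETOBJECT(lcr);   -- extract the row LCR from the ANYDATA wrapper
  DBMS_OUTPUT.PUT_LINE('source db : ' || lcr.GET_SOURCE_DATABASE_NAME);
  DBMS_OUTPUT.PUT_LINE('command   : ' || lcr.GET_COMMAND_TYPE);
  DBMS_OUTPUT.PUT_LINE('object    : ' || lcr.GET_OBJECT_OWNER || '.' ||
                                         lcr.GET_OBJECT_NAME);
  DBMS_OUTPUT.PUT_LINE('txn id    : ' || lcr.GET_TRANSACTION_ID);
  DBMS_OUTPUT.PUT_LINE('scn       : ' || lcr.GET_SCN);
  lcr.EXECUTE(TRUE);             -- apply the row change
END;
/
```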
DDL LCR: contains the modification information generated by a DDL operation on an object. A DDL LCR includes the following information:
- source_database_name: same as row LCR
- command_type: same as row LCR
- object_owner: same as row LCR
- object_name: same as row LCR
- object_type: the object type, such as TABLE, VIEW, or PACKAGE
- ddl_text: the text of the executed DDL statement
- logon_user: the user who executed the DDL statement
- current_schema: the schema in which the DDL statement was executed
- base_table_owner: the owner of the base table (usually empty, but for some operations, such as creating a trigger, it is the owner of the table on which the trigger is defined)
- base_table_name: the base table name; same remarks as above
- tag: same as row LCR
- transaction_id: same as row LCR
- scn: same as row LCR
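Similarly, DDL LCR attributes are read through member functions of LCR$_DDL_RECORD; note that GET_DDL_TEXT is a member procedure that fills a CLOB rather than a function. A sketch of a DDL handler (the procedure and audit table names are illustrative):

```sql
CREATE OR REPLACE PROCEDURE ddl_audit_handler(in_any IN ANYDATA) IS
  lcr      SYS.LCR$_DDL_RECORD;
  rc       PLS_INTEGER;
  ddl_text CLOB;
BEGIN
  rc := in_any.GETOBJECT(lcr);           -- extract the DDL LCR
  DBMS_LOB.CREATETEMPORARY(ddl_text, TRUE);
  lcr.GET_DDL_TEXT(ddl_text);            -- fills the CLOB with the DDL text
  INSERT INTO strmadmin.ddl_audit
    VALUES (SYSDATE, lcr.GET_LOGON_USER, lcr.GET_OBJECT_TYPE, ddl_text);
  DBMS_LOB.FREETEMPORARY(ddl_text);
  lcr.EXECUTE;                           -- apply the DDL change
END;
/
```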
Both row LCRs and DDL LCRs include the source database name. To avoid problems during propagation or apply, Oracle recommends not modifying the source database name.
A capture process can run on the local (source) database, which is known as local capture, or on a remote database, which is known as downstream capture; a database can even be configured for local capture and downstream capture at the same time.
Downstream capture is more complicated because it involves transmitting the locally generated redo logs to a remote database where capture is executed. Downstream capture is therefore further divided into two categories: real-time downstream capture and archived-log downstream capture.
- Real-time downstream capture: redo transport services transmit redo data to the downstream database through LGWR. The downstream database receives it via RFS (remote file server) processes and writes it to standby redo log files. The archiver process reads the standby redo logs and writes them to archived logs. The capture process on the downstream database can capture the modifications from the standby redo logs; it can also capture them from the archived logs, but it reads from the standby redo logs by preference.
- Archived-log downstream capture: the archived log files generated by the source database are copied to the downstream database by any of a variety of means (redo transport services, FTP, the DBMS_FILE_TRANSFER package, etc.). The capture process on the downstream database then captures changes from these archived logs, generates LCRs, and saves them to the queue.
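A real-time downstream capture configuration can be sketched roughly as follows (the service name, database names, and capture name are illustrative; standby redo log files must also be created on the downstream database):

```sql
-- On the source database: ship online redo to the downstream database.
ALTER SYSTEM SET log_archive_dest_2 =
  'SERVICE=dbs2 ASYNC NOREGISTER
   VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=dbs2';
ALTER SYSTEM SET log_archive_dest_state_2 = ENABLE;

-- On the downstream database: create the capture process and enable
-- real-time mining of the standby redo logs.
BEGIN
  DBMS_CAPTURE_ADM.CREATE_CAPTURE(
    queue_name         => 'strmadmin.streams_queue',
    capture_name       => 'real_time_capture',
    source_database    => 'dbs1',
    use_database_link  => TRUE,
    logfile_assignment => 'implicit');
  DBMS_CAPTURE_ADM.SET_PARAMETER(
    capture_name => 'real_time_capture',
    parameter    => 'downstream_real_time_mine',
    value        => 'Y');
END;
/
```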
Supported data types
- TIMESTAMP WITH TIME ZONE
- TIMESTAMP WITH LOCAL TIME ZONE
- INTERVAL YEAR TO MONTH
- INTERVAL DAY TO SECOND
- LONG RAW
As with a logical standby, the capture process does not support the following data types:
- User-defined types (including object types, REFs, varrays, nested tables, and Oracle-supplied types)
Columns that use transparent data encryption are also not supported. When the capture process tries to capture a change involving an unsupported data type, it raises an error and writes a message to its trace file, and by default the capture process is then automatically disabled.
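Whether a table contains columns that capture cannot handle can be checked in the DBA_STREAMS_UNSUPPORTED data dictionary view (available in 10.2 and later; the schema name below is illustrative):

```sql
SELECT table_name, reason, auto_filtered
  FROM dba_streams_unsupported
 WHERE owner = 'HR';
```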
Streams supports the following DML operations: INSERT, UPDATE, DELETE, MERGE, and piecewise updates to LOBs.
- The capture process converts each MERGE into an INSERT or UPDATE during capture, because MERGE is not a valid command type for a row LCR.
- An index-organized table must not contain ROWID or UROWID columns or user-defined types (including object types, REFs, varrays, and nested tables); otherwise the capture process raises an error.
- The capture process ignores commands such as CALL, EXPLAIN PLAN, and LOCK TABLE.
- The capture process does not capture changes to temporary tables or object tables.
- The capture process does not capture sequence operations such as sequence.NEXTVAL, so pay attention to how sequence values are allocated in a bidirectional Streams replication environment.
The capture process also does not capture the following statements:
- ALTER DATABASE
- CREATE CONTROLFILE
- CREATE DATABASE
- CREATE PFILE
- CREATE SPFILE
- FLASHBACK DATABASE
The capture process captures a DDL statement itself, but not the results of the statement. For example, if you execute an ANALYZE statement on the source database, the capture process does not capture the statistics that are generated, only the SQL statement. The one exception is CREATE TABLE AS SELECT: for this statement the capture process captures not only the statement itself but also the selected rows (as INSERT row LCRs).
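For the sequence limitation mentioned above, a common workaround in bidirectional replication is to give each site a disjoint range of sequence values, for example by interleaving increments (the sequence name and two-site split are illustrative):

```sql
-- On site A: generate odd values.
CREATE SEQUENCE orders_seq START WITH 1 INCREMENT BY 2;
-- On site B: generate even values.
CREATE SEQUENCE orders_seq START WITH 2 INCREMENT BY 2;
```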