

Example Analysis of Oracle Transportable Tablespaces (TTS)


This article is a sample analysis of Oracle transportable tablespaces (TTS). The editor finds it very practical and shares it here for reference; follow along to have a look.

Transportable tablespaces are the fastest way for Oracle to move large volumes of data between databases.

Limitations

The source and target database character sets must be compatible: the target database character set and national character set must be the same as, or a superset of, the source's

Encrypted tablespaces cannot be transported across platforms with different endian formats

Tables containing encrypted columns are not supported (see the check queries after this list)

Objects in the SYSTEM tablespace or owned by SYS cannot be transported

The target database version cannot be lower than the source version

The time zone file versions must be consistent
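
The encryption-related limitations above can be checked quickly before starting. A minimal sketch, assuming 11g-style dictionary views:

-- check for encrypted tablespaces (cannot cross endian platforms)
SQL> SELECT tablespace_name, encrypted FROM dba_tablespaces;
-- check for tables with encrypted columns (not supported by TTS)
SQL> SELECT owner, table_name, column_name FROM dba_encrypted_columns;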

Steps

1 Check whether the source and target endian formats match

Windows and Linux platforms are little-endian; most other platforms (for example Solaris SPARC, AIX, HP-UX) are big-endian.

SELECT d.PLATFORM_NAME, tp.ENDIAN_FORMAT FROM V$TRANSPORTABLE_PLATFORM tp, V$DATABASE d WHERE tp.PLATFORM_NAME = d.PLATFORM_NAME;

-- source

PLATFORM_NAME ENDIAN_FORMAT

Solaris [tm] OE (32-bit) Big

-- target

PLATFORM_NAME ENDIAN_FORMAT

Microsoft Windows IA (32-bit) Little

2 Check whether the tablespace set is self-contained

Indexes, partitions of the same table, and materialized views belonging to the base tables must all be inside the transported tablespace set

EXECUTE DBMS_TTS.TRANSPORT_SET_CHECK('sales_1,sales_2', TRUE);

SQL> SELECT * FROM TRANSPORT_SET_VIOLATIONS;

VIOLATIONS

Constraint DEPT_FK between table JIM.EMP in tablespace SALES_1 and table

JIM.DEPT in tablespace OTHER

Partitioned table JIM.SALES is partially contained in the transportable set
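
How a violation like the first one above might be resolved (a sketch; choose what fits the application): either add the OTHER tablespace to the transport set, or drop the cross-tablespace constraint before the export, for example:

SQL> ALTER TABLE jim.emp DROP CONSTRAINT dept_fk;
-- then re-run the self-containment check
SQL> EXECUTE DBMS_TTS.TRANSPORT_SET_CHECK('sales_1,sales_2', TRUE);
SQL> SELECT * FROM TRANSPORT_SET_VIOLATIONS;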

3 Export the metadata and copy the data files. First set the tablespaces to read-only.

SQL> ALTER TABLESPACE sales_1 READ ONLY;

Tablespace altered.

SQL> ALTER TABLESPACE sales_2 READ ONLY;

Tablespace altered.

expdp system dumpfile=expdat.dmp directory=data_pump_dir transport_tablespaces=sales_1,sales_2 transport_full_check=y logfile=tts_export.log

-- The transport_full_check parameter verifies that the tablespace set is self-contained; if the check fails, expdp terminates.

Dump file set for SYSTEM.SYS_EXPORT_TRANSPORTABLE_01 is:

/u01/app/oracle/admin/salesdb/dpdump/expdat.dmp

Datafiles required for transportable tablespace SALES_1:

/u01/app/oracle/oradata/salesdb/sales_101.dbf

Datafiles required for transportable tablespace SALES_2:

/u01/app/oracle/oradata/salesdb/sales_201.dbf

After the export completes, you can set the tablespaces back to read-write mode on the source.
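
For example, back on the source:

SQL> ALTER TABLESPACE sales_1 READ WRITE;
SQL> ALTER TABLESPACE sales_2 READ WRITE;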

If the target is Exadata and its endian format differs from the source, Oracle offers cross-platform incremental backup to reduce the impact on the source database; see MOS note 1389592.1 for details

The data files are copied while online, then incremental backups are taken and applied on the target until it is nearly synchronized with the source

In the last step the source tablespace is set to read-only and one final incremental backup is taken and applied, which minimizes the time the source tablespace is unavailable.
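
MOS note 1389592.1 provides scripts that automate this procedure; purely as a sketch of the underlying idea, the RMAN commands on the source look roughly like this (staging paths are illustrative):

-- 1. initial image copy of the tablespace data files, taken while they stay online
RMAN> BACKUP AS COPY TABLESPACE sales_1 FORMAT '/stage/sales1_%U';
-- 2. repeated level 1 incrementals, shipped to the target, converted and
--    applied to the copies there until the remaining gap is small
RMAN> BACKUP INCREMENTAL LEVEL 1 TABLESPACE sales_1 FORMAT '/stage/sales1_incr_%U';
-- 3. finally: set the tablespace read-only, take and apply one last incremental,
--    then run the normal TTS metadata export/import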

4 Transfer the dump file and data files to the target

If the source and target do not use ASM, the files can be copied with ordinary OS tools (see the sketch below); with ASM you can use dbms_file_transfer or RMAN for the transfer

Steps 4 and 5 can be swapped depending on the actual situation.
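
A minimal non-ASM sketch, assuming an OS user and a staging directory on the target host (both hypothetical) that can receive the files over scp:

# copy the Data Pump dump file and the data files to the target host
scp /u01/app/oracle/admin/salesdb/dpdump/expdat.dmp oracle@targethost:/staging/
scp /u01/app/oracle/oradata/salesdb/sales_101.dbf   oracle@targethost:/staging/
scp /u01/app/oracle/oradata/salesdb/sales_201.dbf   oracle@targethost:/staging/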

5 Convert the endian format

The conversion can be executed on either the source or the target side

-- source

RMAN> CONVERT TABLESPACE sales_1,sales_2
2> TO PLATFORM 'Microsoft Windows IA (32-bit)'
3> FORMAT '/tmp/%U';

Starting conversion at source at 30-SEP-08

Using channel ORA_DISK_1

Channel ORA_DISK_1: starting datafile conversion

Input datafile file number=00007 name=/u01/app/oracle/oradata/salesdb/sales_101.dbf

Converted datafile=/tmp/data_D-SALESDB_I-1192614013_TS-SALES_1_FNO-7_03jru08s

Channel ORA_DISK_1: datafile conversion complete, elapsed time: 00:00:45

Channel ORA_DISK_1: starting datafile conversion

Input datafile file number=00008 name=/u01/app/oracle/oradata/salesdb/sales_201.dbf

Converted datafile=/tmp/data_D-SALESDB_I-1192614013_TS-SALES_2_FNO-8_04jru0aa

Channel ORA_DISK_1: datafile conversion complete, elapsed time: 00:00:25

Finished conversion at source at 30-SEP-08

-- target

The dump file is located in the DATA_PUMP_DIR directory, while the data files are located in C:\Temp

RMAN> CONVERT DATAFILE 'C:\Temp\sales_101.dbf','C:\Temp\sales_201.dbf'
2> TO PLATFORM="Microsoft Windows IA (32-bit)"
3> FROM PLATFORM="Solaris[tm] OE (32-bit)"
4> DB_FILE_NAME_CONVERT='C:\Temp\','C:\app\orauser\oradata\orawin\'
5> PARALLELISM=4;

Note: if the source and target do not use ASM, you do not need to specify the source or target platform; RMAN determines the source platform from the data file header, and the target platform defaults to the current host
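
So on a non-ASM target the command above can be shortened, for example (a sketch reusing the same Windows paths as above):

RMAN> CONVERT DATAFILE 'C:\Temp\sales_101.dbf','C:\Temp\sales_201.dbf'
2> DB_FILE_NAME_CONVERT='C:\Temp\','C:\app\orauser\oradata\orawin\';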

6 Import into the target

If the tablespace block size is not the standard block size of the target database, you need to set the corresponding db_nk_cache_size parameter on the target (see the example below)
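
A hypothetical example: if the transported tablespace uses 16K blocks while the target's db_block_size is 8K, a 16K buffer cache must be configured on the target first:

-- size is illustrative; SCOPE=BOTH assumes the instance uses an spfile
SQL> ALTER SYSTEM SET db_16k_cache_size = 64M SCOPE=BOTH;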

impdp system dumpfile=expdat.dmp directory=data_pump_dir transport_datafiles=c:\app\orauser\oradata\orawin\sales_101.dbf,c:\app\orauser\oradata\orawin\sales_201.dbf remap_schema=sales1:crm1 remap_schema=sales2:crm2 logfile=tts_import.log
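
After the import, the plugged-in tablespaces are read-only on the target; a quick check and, if desired, switching them back to read-write (a sketch):

SQL> SELECT tablespace_name, status, plugged_in FROM dba_tablespaces;
SQL> ALTER TABLESPACE sales_1 READ WRITE;
SQL> ALTER TABLESPACE sales_2 READ WRITE;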

Executing TTS on a standby (MOS note 788176.1)

A physical standby can run TTS, but it must first be activated. The general steps are as follows (a sketch of the commands follows the list):

1 Ensure the standby is synchronized with the primary, then stop redo transport and MRP

2 Create a restore point so the standby can be flashed back after the TTS

3 Activate the standby (ALTER DATABASE ACTIVATE STANDBY DATABASE) and set the protection mode to maximum performance (ALTER DATABASE SET STANDBY DATABASE TO MAXIMIZE PERFORMANCE)

4 Execute the TTS

5 Flash back to the restore point and convert back to a physical standby (FLASHBACK DATABASE TO RESTORE POINT b; ALTER DATABASE CONVERT TO PHYSICAL STANDBY)
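
A sketch of the corresponding commands (the restore point name and the archive destination number are illustrative):

-- on the primary: stop redo shipping to the standby (assuming dest_2 points to it)
SQL> ALTER SYSTEM SET log_archive_dest_state_2 = DEFER;
-- on the standby
SQL> ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;        -- stop MRP
SQL> CREATE RESTORE POINT before_tts GUARANTEE FLASHBACK DATABASE;
SQL> ALTER DATABASE ACTIVATE STANDBY DATABASE;
SQL> ALTER DATABASE OPEN;
-- ... run the TTS export from the activated database ...
SQL> SHUTDOWN IMMEDIATE
SQL> STARTUP MOUNT
SQL> FLASHBACK DATABASE TO RESTORE POINT before_tts;
SQL> ALTER DATABASE CONVERT TO PHYSICAL STANDBY;
-- restart in mount, re-enable redo shipping on the primary and restart MRP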

TTS for a single table partition (MOS note 731559.1)

Exchange the single partition into a new table (ALTER TABLE ... EXCHANGE PARTITION), then drop the partition and transport the tablespace containing the new table
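
A sketch with illustrative names (table SALES, partition P2008, staging table SALES_P2008):

-- create an empty table with the same column structure as the partitioned table
SQL> CREATE TABLE sales_p2008 AS SELECT * FROM sales WHERE 1 = 0;
-- metadata-only swap: the standalone table takes over the partition's segment
SQL> ALTER TABLE sales EXCHANGE PARTITION p2008 WITH TABLE sales_p2008;
-- drop the now-empty partition, then transport the tablespace holding SALES_P2008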

TTS based on ASM storage (MOS note 394798.1)

Using dbms_file_transfer to transfer the dump file and data files requires a database link

1 Create a database link pointing to the target: CREATE DATABASE LINK db2 CONNECT TO system IDENTIFIED BY manager1 USING 'db2';

2 Create an ASM-based directory on both source and target: CREATE DIRECTORY tts_dump AS '+DATA';

3 Export metadata

[ora10g@host1]$ expdp system/manager1 directory=tts_dump dumpfile=tts1_db1.dmp logfile=tts_dump_log:tts.log transport_tablespaces=tts_1,tts_2 transport_full_check=y

Starting "SYSTEM". "SYS_EXPORT_TRANSPORTABLE_02": system/* directory=tts_datafile dumpfile=tts1.dmp logfile=tts_dump_log:tts.log transport

_ tablespaces=tts_1,tts_2 transport_full_check=y

Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK

Processing object type TRANSPORTABLE_EXPORT/TABLE

Processing object type TRANSPORTABLE_EXPORT/INDEX

Processing object type TRANSPORTABLE_EXPORT/INDEX_STATISTICS

Processing object type TRANSPORTABLE_EXPORT/TABLE_STATISTICS

Processing object type TRANSPORTABLE_EXPORT/POST_INSTANCE/PLUGTS_BLK

Master table "SYSTEM". "SYS_EXPORT_TRANSPORTABLE_02" successfully loaded/unloaded

Dump file set for SYSTEM.SYS_EXPORT_TRANSPORTABLE_02 is: +DATA/tts1.dmp

Job "SYSTEM". "SYS_EXPORT_TRANSPORTABLE_02" successfully completed at 14:00:34

4 Transfer the dump file and data file to the target

SQL> begin
2 dbms_file_transfer.put_file(
3   source_directory_object      => 'tts_dump',
4   source_file_name             => 'tts1_db1.dmp',
5   destination_directory_object => 'tts_dump',
6   destination_file_name        => 'tts1_db1.dmp',
7   destination_database         => 'db2');
8 end;
9 /

SQL> begin
2 dbms_file_transfer.put_file(
3   source_directory_object      => 'tts_datafile',
4   source_file_name             => 'tts_1.294.570721319',
5   destination_directory_object => 'tts_datafile',
6   destination_file_name        => 'tts1_db1.dbf',
7   destination_database         => 'db2');
8 end;
9 /

5 Import into the target

impdp directory=tts_dump dumpfile=tts1_db1.dmp logfile=tts_dump_log:tts1.log transport_datafiles='+DATA1/tts1_db1.dbf','+DATA1/tts2_db1.dbf' keep_master=y

Master table "SYSTEM". "SYS_IMPORT_TRANSPORTABLE_03" successfully loaded/unloaded

Starting "SYSTEM". "SYS_IMPORT_TRANSPORTABLE_03": system/* parfile=impdp.par

Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK

Processing object type TRANSPORTABLE_EXPORT/TABLE

Processing object type TRANSPORTABLE_EXPORT/INDEX

Processing object type TRANSPORTABLE_EXPORT/INDEX_STATISTICS

Processing object type TRANSPORTABLE_EXPORT/TABLE_STATISTICS

Processing object type TRANSPORTABLE_EXPORT/POST_INSTANCE/PLUGTS_BLK

Job "SYSTEM". "SYS_IMPORT_TRANSPORTABLE_03" successfully completed at 15:05:00

Thank you for reading! This is the end of this article on "Example Analysis of Oracle Transportable Tablespaces (TTS)". I hope the content above is helpful to you. If you think the article is good, please share it so more people can see it!
