Hi, we are planning to migrate a 64 TB database that has large tables of a few TB each. Can you suggest a good migration method for moving to AWS RDS?
@upgradenow 6 months ago
Hi, I have no knowledge about the ins and outs of Amazon RDS, so I can't provide good advice. Sorry. Regards, Daniel
@MukeshSahu-mn2ps 6 months ago
16 hours is still huge. I have a similar requirement where we need to migrate from AIX to Linux. The database is running on 19c with a filesystem, the DB size is 45 TB, and the plan is to move to Linux on ASM. The customer can afford only 1-2 hours of downtime. What technologies would you recommend?
@upgradenow 6 months ago
Hi, This reference project is a few years old and uses an older technique. We have a new, improved method for cross-platform migrations that we explain in our webinar "Cross Platform Migration - Transportable Tablespaces to the Extreme". You can find the video and slides here: dohdatabase.com/webinars/
For a cross-platform migration of such a large database, I think only Oracle GoldenGate is a viable solution. Transportable Tablespaces as shown above might be possible in Oracle Database 21c and later with parallel transportable jobs in Data Pump, but only for very simple databases (they might be big, but the dictionary must not be complex). Regards, Daniel
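To make the Transportable Tablespaces option above concrete, here is a sketch of a Data Pump parameter file for a full transportable export. This is an illustration, not the M5 method from the webinar: the directory object, dumpfile names, and parallel degree are assumptions, and PARALLEL only speeds up the transportable metadata export in Oracle Database 21c and later, as the reply notes.

```text
# Hypothetical parameter file, e.g. run as: expdp system parfile=ftex.par
# Names below (DATA_PUMP_DIR, file names, parallel=8) are placeholders.
directory=DATA_PUMP_DIR
dumpfile=ftex_%L.dmp
logfile=ftex.log
full=y
transportable=always
parallel=8
```

On the source side, the tablespaces must be read-only during the export, and the data files are then copied (and endian-converted if needed) to the target platform.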
@manzambibissengomatila7383 7 months ago
Hi, Could you please be so kind as to tell us the maximum number of characters, or of tablespace names separated by commas, allowed in the dbmig_ts_list.txt file (M5)? Thanks in advance
@upgradenow 7 months ago
Hi, There shouldn't be a maximum - the script should be able to handle whatever the database can handle. We recently launched a new version of the M5 script which handles cases with thousands of tablespaces better. In another case I worked on, we had almost 10,000 data files. If you run into trouble, you must have a very large number of tablespaces; in that case, create a service request. Regards, Daniel
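As a rough illustration of the list file discussed here, the sketch below builds a dbmig_ts_list.txt from a set of tablespace names. This is not part of the M5 script: the comma-separated format follows the question above, the tablespace names are made up, and whether M5 expects one line or other separators is not confirmed here.

```python
# Illustrative sketch only - not the M5 script itself.
# Writes a comma-separated tablespace list to dbmig_ts_list.txt,
# imposing no length limit, in line with the answer that the script
# handles whatever the database can handle.

def write_ts_list(names, path="dbmig_ts_list.txt"):
    # Join all tablespace names into one comma-separated line.
    with open(path, "w") as f:
        f.write(",".join(names) + "\n")

# Hypothetical tablespace names for demonstration.
tablespaces = ["USERS", "DATA_TS01", "DATA_TS02", "INDEX_TS01"]
write_ts_list(tablespaces)
```

In a real migration the name list would come from the source database (e.g. a query against DBA_TABLESPACES) rather than being hard-coded.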
@manzambibissengomatila7383 7 months ago
@@upgradenow We have 3,329 tablespaces, 5,714 data files, and a DB size of 390 TB. Is it possible to go with the recent M5 script, please?