Task: copy all data from one database (data only, no schema) to another database (possibly of a different type). I can't modify the source database, so this is effectively a read-only backup.
Context: I have to integrate Oracle with a number of other databases. Right now I'm integrating Oracle and Postgres.
Resources: a connection string only, with the ability to connect to the database with the highest available privileges. (I can't access the server via SSH, so there is no way to create an ordinary backup and download the files, or to compile and start a web/FTP server, etc.)
Question: Is there a proven and FAST way to pull this data? Perhaps someone knows of an open-source solution with clean code?
The word "fast" is there because simply selecting N rows at a time (using rownum or row_number()) and transferring them to the target database or an intermediate file is too slow.
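For concreteness, the slow pattern I mean looks roughly like the sketch below (hypothetical, using the python-oracledb driver; big_table, id, payload and the DSN are made-up names). Every window forces Oracle to re-run the inner scan and sort, so the total cost grows badly with table size:

```python
import oracledb  # python-oracledb driver; credentials/DSN are placeholders

conn = oracledb.connect(user="system", password="secret", dsn="src-host:1521/ORCL")
PAGE = 10_000
offset = 0
with conn.cursor() as cur:
    while True:
        # Each iteration re-numbers the whole table just to cut out one window.
        cur.execute(
            """SELECT id, payload FROM (
                   SELECT t.id, t.payload, ROW_NUMBER() OVER (ORDER BY t.id) rn
                   FROM big_table t)
               WHERE rn > :lo AND rn <= :hi""",
            lo=offset, hi=offset + PAGE)
        rows = cur.fetchall()
        if not rows:
            break
        offset += PAGE
        # ... push `rows` to the target or an intermediate file (this repeats per window)
```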
Comments:
- Why not a single SELECT for all the rows, letting the client fetch the data in batches?
- You could take a backup (e.g. via dbms_datapump from within SQL), but you actually want to copy the data to a different DBMS - which is something completely different than "taking a backup".
- Export to flat files and use COPY in Postgres to import the flat files.
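Combining the first and last suggestions avoids the intermediate file entirely: open one cursor over the whole table, fetch in large batches, and feed the rows straight into Postgres's COPY ... FROM STDIN. A minimal sketch, assuming the python-oracledb and psycopg2 drivers; table and column names are again made up, and a real table will need attention to type formatting (None becomes an empty unquoted CSV field, which COPY's csv format reads back as NULL):

```python
import csv
import io

import oracledb   # Oracle driver (python-oracledb)
import psycopg2   # Postgres driver


class CursorCSV:
    """File-like adapter: serves an Oracle cursor as CSV bytes for COPY."""

    def __init__(self, cursor, batch=10_000):
        self.cursor = cursor
        self.batch = batch
        self.buf = b""

    def read(self, size=-1):
        # copy_expert() pulls data by calling read(size) until it gets b"".
        while size < 0 or len(self.buf) < size:
            rows = self.cursor.fetchmany(self.batch)
            if not rows:
                break
            out = io.StringIO()
            csv.writer(out, lineterminator="\n").writerows(rows)
            self.buf += out.getvalue().encode()
        if size < 0:
            data, self.buf = self.buf, b""
        else:
            data, self.buf = self.buf[:size], self.buf[size:]
        return data


src = oracledb.connect(user="system", password="secret", dsn="src-host:1521/ORCL")
dst = psycopg2.connect("host=dst-host dbname=target user=postgres password=secret")

with src.cursor() as rd, dst.cursor() as wr:
    rd.arraysize = 10_000  # rows per Oracle network round trip
    rd.execute("SELECT id, payload FROM big_table")  # one pass, no ROWNUM windows
    wr.copy_expert(
        "COPY big_table (id, payload) FROM STDIN WITH (FORMAT csv)",
        CursorCSV(rd))
    dst.commit()
```

The two knobs that matter for speed here are arraysize on the Oracle side (fewer round trips) and COPY on the Postgres side (its bulk-load path), and no flat file ever touches disk.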