Merge pull request #3039 from EnterpriseDB/release/2022-08-12
Release: 2022-08-12
drothery-edb authored Aug 12, 2022
2 parents 5ebdc5c + acb429b commit ab88665
Showing 5 changed files with 68 additions and 35 deletions.
@@ -60,20 +60,20 @@ Perform the following steps:
setup-csp --provider
{--account-id | --subscription-id}
--region
[--instance-type --high-availability --networking | --skip-preflight]
[--instance-type --cluster-architecture --networking | --skip-preflight]
[--run]
```
!!! Important
Do not delete the `ba-passport.json` file created in your working directory. It contains important identity and access management information used by `connect-csp` while connecting to your cloud.
Here is an example of setting up an AWS account:

```shell
biganimal setup-csp --provider aws --account-id 123456789102 --region us-east-1 --instance-type aws:r5.large --high-availability --networking private --run
biganimal setup-csp --provider aws --account-id 123456789102 --region us-east-1 --instance-type aws:r5.large --cluster-architecture ha --networking private --run
```

Here is an example of setting up an Azure account:
```shell
biganimal setup-csp --provider azure --subscription-id abc12345-1234-1234-abcd-12345678901 --region eastus --instance-type azure:Standard_E4s_v3 --high-availability --networking private --run
biganimal setup-csp --provider azure --subscription-id abc12345-1234-1234-abcd-12345678901 --region eastus --instance-type azure:Standard_E4s_v3 --cluster-architecture ha --networking private --run
```
For more information on the command arguments, run the following command:
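A minimal sketch, assuming the CLI follows the usual `--help` convention:

```shell
# Assumed invocation: display help for setup-csp and its arguments
biganimal setup-csp --help
```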
@@ -1,5 +1,5 @@
---
title: "Extracting schemas using data pump utility"
title: "Extracting schemas using Oracle Data Pump utilities"

---

@@ -31,7 +31,7 @@ Perform either of the following procedures:

- [Extract all schemas in a database](#extract-all-schemas-in-a-database)

### Extract one or more schemas in a database
## Extract one or more schemas in a database

1. Before running the `expdp` command, create a file with a `.par` extension (for example, `export.par`) on your server. Add attributes and values to the file:

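A minimal sketch of such a parameter file, assuming a metadata-only export of a single schema (the schema name and file names are placeholders; `DIRECTORY` and `DUMPFILE` mirror the `impdp` example later on this page):

```
SCHEMAS=HR
DIRECTORY=DMPDIR
CONTENT=METADATA_ONLY
DUMPFILE=schemas_metadata.dump
LOGFILE=export_schemas.log
```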
@@ -61,9 +61,9 @@ Perform either of the following procedures:

```

See an important [note](../01_mp_schema_extraction/#about-file-encoding) about the file encoding format expected by the Migration Portal.
See [file encoding](known_issues_notes/#file-encoding) for information about the file encoding format expected by the Migration Portal.

### Extract all schemas in a database
## Extract all schemas in a database

!!! Note
Don't perform this procedure from a user account that belongs to the excluded schemas list (see [Unsupported schemas](../01_mp_schema_extraction/#unsupported-schemas)). The `impdp` command fails if the user account running the command is in the excluded list of schemas.
@@ -113,4 +113,4 @@ See an important [note](../01_mp_schema_extraction/#about-file-encoding) about t
```shell
impdp <Username>@<ConnectIdentifier> DIRECTORY=DMPDIR TRANSFORM=OID:n,SEGMENT_ATTRIBUTES:n SQLFILE=YourSchemas.sql DUMPFILE=schemas_metadata.dump parfile=import.par
```
See an important [note](../01_mp_schema_extraction/#about-file-encoding) about the file encoding format expected by the Migration Portal.
See [file encoding](known_issues_notes/#file-encoding) for information about the file encoding format expected by the Migration Portal.
@@ -8,13 +8,19 @@ legacyRedirectsGenerated:
redirects:
- ../01_whats_new/

navigation:
- "01_data_pump_utility"
- "known_issues_notes"

---

<div id="mp_schema_extraction" class="registered_link"></div>

You can perform a schema extraction using either of the following methods. EDB recommends using the EDB DDL Extractor to extract your schemas.
- [EDB DDL extractor](#extracting-schemas-using-the-edb-ddl-extractor) (recommended method)
- [Oracle’s Data Pump utility](01_data_pump_utility/)
- [EDB DDL Extractor](#extracting-schemas-using-the-edb-ddl-extractor) (recommended method)
- [Oracle Data Pump utilities](01_data_pump_utility/)

For more information, see [Known issues, limitations, and notes](known_issues_notes/).

## Extracting schemas using the EDB DDL Extractor

@@ -25,7 +31,7 @@ Download the latest EDB DDL Extractor script from the Migration Portal Projects

### Prerequisites

You can run the EDB DDL Extractor script in SQL Developer or SQL\*Plus. It uses Oracle’s `DBMS_METADATA` built-in package to extract DDLs for different objects under schemas (specified while running the script). The EDB DDL extractor creates the DDL file uploaded to the portal and analyzed for EDB Postgres Advanced Server compatibility.
You can run the EDB DDL Extractor script in SQL Developer or SQL\*Plus. It uses Oracle’s `DBMS_METADATA` built-in package to extract DDLs for different objects under schemas (specified while running the script). The EDB DDL Extractor creates the DDL file uploaded to the portal and analyzed for EDB Postgres Advanced Server compatibility.
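
For context, `DBMS_METADATA` can also be queried directly in SQL; a sketch (the schema and table names are illustrative):

```sql
-- Illustrative only: fetch the DDL for a single table, the kind of call the script makes internally
SELECT DBMS_METADATA.GET_DDL('TABLE', 'EMPLOYEES', 'HR') FROM dual;
```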

!!! Note
You must have `CONNECT` and `SELECT_CATALOG_ROLE` roles and `CREATE TABLE` privilege.
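
A DBA might grant these to the extracting user with statements along these lines (a sketch; `migration_user` is a placeholder):

```sql
-- Placeholder user name; roles and privilege required to run the EDB DDL Extractor
GRANT CONNECT TO migration_user;
GRANT SELECT_CATALOG_ROLE TO migration_user;
GRANT CREATE TABLE TO migration_user;
```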
@@ -93,20 +99,12 @@ Output of the DDL Extractor run appears in the Script Output tab. The name of th

The script then iterates through the object types in the source database. Once the task is completed, the `.SQL` output is stored at the location you entered (e.g., `c:\Users\Example\Desktop\`).

See an important [note](#about-file-encoding) about the file encoding format expected by Migration Portal.
See [file encoding](known_issues_notes/#file-encoding) for information about the file encoding format expected by Migration Portal.

### Additional notes

- The EDB DDL Extractor script doesn't extract objects restored using `Flashback` that still have names like `BIN$b54+4XlEYwPgUAB/AQBWwA==$0`. If you want to extract these objects, you must change the names of the objects and rerun the extraction process.
- The EDB DDL Extractor extracts `nologging` tables as normal tables. Once these tables are migrated to EDB Postgres Advanced Server, WAL log files are created.
- The EDB DDL Extractor extracts objects only with VALID status. For any objects that have INVALID status that you want Migration Portal to assess, first update them to VALID.
- The EDB DDL Extractor doesn't extract objects that were obfuscated using the Oracle wrap feature. As such, these objects aren't included in the set of DDL assessed by the Migration Portal. If you want to assess these objects and migrate them to EDB Postgres Advanced Server, replace the wrapped versions of the objects with nonwrapped versions. See [Wrapped objects](#wrapped-objects) for more information.
- The EDB DDL Extractor creates Global Temporary tables to store the schema names and their dependency information. These tables are dropped at the end of successful extraction.
- The EDB DDL Extractor script doesn't extract schemas whose name starts with `PG_` because PostgreSQL doesn't support it. If you want to extract these schemas, you must change the name of the schema before extraction.

## Schemas and objects support

The lists and tables that follow show supported and unsupported schemas and objects.
The lists and tables that follow show supported and unsupported schemas and objects in Migration Portal.

### Unsupported schemas

@@ -163,17 +161,8 @@ Exclude these Oracle system schemas while generating the SQL dump file.
- Directories
- Users
- RLS policy
- Queues

## Wrapped objects

Migration Portal can't assess wrapped objects. If you include them in the extracted DDL, they aren't loaded into Migration Portal and aren't included in the count of objects that are assessed. If you want to assess wrapped objects and migrate them to EDB Postgres Advanced Server, include unwrapped versions of the objects in the DDL file that you upload to Migration Portal. The recommended way of doing this is to replace the wrapped versions in the Oracle database with clear-text versions before performing the schema extraction. After performing the schema extraction, you can replace the objects with the wrapped versions.

## About file encoding

Migration Portal recommends the `.SQL` output file be in the UTF-8 encoding format. If you upload a `.SQL` file with non-UTF-8 encoding, all the characters that aren't compatible with UTF-8 are converted to the replacement character ‘�’ in the output DDL.
- Queues *
- Library
- Indextype

!!! Tip
You can manually convert the extracted file to the UTF-8 format by using the iconv utility on Linux or the LibIconv utility on Windows. For example, if your database character set is in Latin-1 (ISO-8859-1), you can convert the extracted file to the UTF-8 format, as follows:

`iconv -f iso-8859-1 -t UTF-8 sample.sql > sample_utf8.sql`
\* Even though EDB Postgres Advanced Server provides support for Queue tables, Migration Portal doesn't currently support them. Queue tables in the source DDL aren't uploaded as source and target DDL objects.
@@ -0,0 +1,44 @@
---
title: "Known issues, limitations, and notes"

---

This page lists known issues, limitations, and notes for:

- [Migration Portal](#migration-portal)
- [EDB DDL Extractor](#edb-ddl-extractor)
- [Oracle Data Pump utilities](#oracle-data-pump-utilities)

## Migration Portal

### Wrapped objects

Migration Portal can't assess wrapped objects. If you include them in the extracted DDL, they aren't loaded into Migration Portal and aren't included in the count of objects that are assessed. If you want to assess wrapped objects and migrate them to EDB Postgres Advanced Server, include unwrapped versions of the objects in the DDL file that you upload to Migration Portal. The recommended way of doing this is to replace the wrapped versions in the Oracle database with clear-text versions before performing the schema extraction. After performing the schema extraction, you can replace the objects with the wrapped versions.

### File encoding

Migration Portal recommends the `.SQL` output file be in the UTF-8 encoding format. If you upload a `.SQL` file with non-UTF-8 encoding, all the characters that aren't compatible with UTF-8 are converted to the replacement character ‘�’ in the output DDL.

!!! Tip
You can manually convert the extracted file to the UTF-8 format by using the iconv utility on Linux or the LibIconv utility on Windows. For example, if your database character set is in Latin-1 (ISO-8859-1), you can convert the extracted file to the UTF-8 format, as follows:

`iconv -f iso-8859-1 -t UTF-8 sample.sql > sample_utf8.sql`

### ALTER statements

Migration Portal doesn't process any ALTER statements in the DDL other than ALTER TABLE and ALTER TRIGGER.
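
For example (object names are illustrative), only the first statement below is processed; the rest are ignored during assessment:

```sql
-- Processed: ALTER TABLE statements are assessed
ALTER TABLE employees ADD CONSTRAINT employees_pk PRIMARY KEY (employee_id);

-- Ignored: other ALTER statements, such as ALTER FUNCTION and ALTER PACKAGE
ALTER FUNCTION calc_bonus COMPILE;
ALTER PACKAGE payroll_pkg COMPILE BODY;
```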

## EDB DDL Extractor

- The EDB DDL Extractor script doesn't extract objects restored using `Flashback` that still have names like `BIN$b54+4XlEYwPgUAB/AQBWwA==$0`. If you want to extract these objects, you must change the names of the objects and rerun the extraction process (see the sketch after this list).
- The EDB DDL Extractor extracts `nologging` tables as normal tables. Once these tables are migrated to EDB Postgres Advanced Server, WAL log files are created.
- The EDB DDL Extractor extracts objects only with VALID status. For any objects that have INVALID status that you want Migration Portal to assess, first update them to VALID.
- The EDB DDL Extractor doesn't extract objects that were obfuscated using the Oracle wrap feature. As such, these objects aren't included in the set of DDL assessed by the Migration Portal. If you want to assess these objects and migrate them to EDB Postgres Advanced Server, replace the wrapped versions of the objects with nonwrapped versions. See [Wrapped objects](#wrapped-objects) for more information.
- The EDB DDL Extractor creates Global Temporary tables to store the schema names and their dependency information. These tables are dropped at the end of successful extraction.
- The EDB DDL Extractor script doesn't extract schemas whose name starts with `PG_` because PostgreSQL doesn't support it. If you want to extract these schemas, you must change the name of the schema before extraction.
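
A sketch of the rename mentioned in the first note above, assuming the leftover object is a dependent index (both names are placeholders):

```sql
-- Placeholder names; give the restored object an ordinary name, then rerun the extraction
ALTER INDEX "BIN$b54+4XlEYwPgUAB/AQBWwA==$0" RENAME TO employees_name_idx;
```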

## Oracle Data Pump utilities

- Migration Portal might fail to parse your SQL file if you create a database link using the IDENTIFIED BY clause with Oracle's quote operator; for example, `IDENTIFIED BY VALUES q'[:1]'`. To parse your file successfully, try using an actual password; for example, `IDENTIFIED BY my_password`. (See the sketch after this list.)

- The DDL generated by Oracle Data Pump utilities might contain ALTER statements such as ALTER FUNCTION, ALTER PACKAGE, and ALTER TYPE, which aren't processed by Migration Portal.
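
A sketch of the difference described in the first item above (the link name, user, password, and connect string are placeholders):

```sql
-- Form that can appear in generated DDL and might fail to parse in Migration Portal
CREATE DATABASE LINK sales_link CONNECT TO remote_user IDENTIFIED BY VALUES q'[:1]' USING 'sales_db';

-- Workaround described above: use a literal (placeholder) password instead
CREATE DATABASE LINK sales_link CONNECT TO remote_user IDENTIFIED BY my_password USING 'sales_db';
```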
@@ -84,7 +84,7 @@ You can optionally select **Generate PDF** to save the report in PDF format. You
!!! Note
Migration Portal doesn't assess sensitive PL/SQL source code hidden in Oracle wrapped objects. These wrapped objects aren't included in the assessed objects count, and therefore the true compatibility percentage might differ from the value calculated in the assessment report.

See the note in [Performing a schema extraction](01_mp_schema_extraction#wrapped-objects) for more information about wrapped objects.
See the note in [Known issues and notes](01_mp_schema_extraction/known_issues_notes#wrapped-objects) for more information about wrapped objects.

## Evaluate an assessment report
