README
     Progress(R) DataDirect(R)
     DataDirect Connect(R) for JDBC
     DataDirect Connect XE (Extended Edition) for JDBC
     Release 5.1.4
     February 14, 2019

***********************************************************************
Copyright (c) 1994-2019 Progress Software Corporation and/or its 
subsidiaries or affiliates. All Rights Reserved.

***********************************************************************


CONTENTS

Requirements
Installation Directory
Changes since Service Pack 4
Changes for Service Pack 4
Changes for Service Pack 3
Changes for Service Pack 2
Changes for Service Pack 1
Release 5.1.0 Features
Installation/Uninstallation
Available DataDirect Connect Series for JDBC Drivers
Notes, Known Problems, and Restrictions
Using the Documents
DataDirect Connect Series for JDBC Files


     Requirements

A supported JVM must be defined on your system path.

* Java SE 7 or higher is required to use the Salesforce driver.

* Java SE 5 or higher is required for other Connect for JDBC drivers.


     Installation Directory

The default installation directory for DataDirect Connect for JDBC and
DataDirect Connect XE for JDBC is:

* Windows:
  C:\Program Files\Progress\DataDirect\Connect_for_JDBC_51

* UNIX/Linux:
  /opt/Progress/DataDirect/Connect_for_JDBC_51


     Changes since Service Pack 4


Certifications
--------------
* The driver for Apache Hive has been certified with the following versions:
  
  - Oracle JDK 11
    Driver version 5.1.4.000178 (F000376.U000183)

  - OpenJDK 11
    Driver version 5.1.4.000178 (F000376.U000183)

  - OpenJDK 8 on Windows and Linux
    Driver version 5.1.4.000178 (F000376.U000183)

  - Apache Hive 2.0 and 2.1.
    Driver version 5.1.4.000134 (F000295.U000128)

  - Apache Hive 1.2.
    Driver version 5.1.4.000111 (F000232.U000096)

  - Apache Hive 1.0 and 1.1.
    Driver version 5.1.4.000100 (F000202.U000087)

* The driver for Apache Hive has been certified with the following
  distributions:

  - Amazon (AMI) 3.7
    Driver version 5.1.4.000100 (F000202.U00008)

  - Cloudera (CDH) 5.8, 5.9, 5.10, 5.11, 5.12
    Driver version 5.1.4.000147 (F000318.U000137)

  - Cloudera (CDH) 5.6 and 5.7
    Driver version 5.1.4.000134 (F000295.U000128)

  - Cloudera (CDH) 5.4 and 5.5
    Driver version 5.1.4.000128 (F000272.U000122)

  - Cloudera (CDH) 5.3
    Driver version 5.1.4.000100 (F000202.U00005)

  - Hortonworks (HDP) 2.5
    Driver version 5.1.4.000136 (F000301.U000131)

  - Hortonworks (HDP) 2.3 and 2.4
    Driver version 5.1.4.000111 (F000232.U000096)

  - IBM BigInsights 4.1
    Driver version 5.1.4.000128 (F000272.U000122)

  - IBM BigInsights 3.0
    Driver version 5.1.4.000100 (F000202.U00008)

  - MapR 5.2
    Driver version 5.1.4.000147 (F000318.U000137)

  - Pivotal (PHD) 2.1
    Driver version 5.1.4.000100 (F000202.U00008)

* The DB2 driver has been certified with the following versions: 

  - Oracle JDK 11
    Driver version 5.1.4.000237 (F000378.U000183) 

  - OpenJDK 11
    Driver version 5.1.4.000237 (F000378.U000183) 

  - OpenJDK 8 on Windows and Linux
    Driver version 5.1.4.000237 (F000378.U000183) 

  - DB2 V12 for z/OS.
    Driver version 5.1.4.000217 (F000355.U000174)

  - IBM dashDB (now IBM Db2 Warehouse on Cloud)
    Driver version 5.1.4.000207 (F000348.U000165)	

  - IBM DB2 for i 7.3.
    Driver version 5.1.4.000187 (F000323.U000141) 

  - IBM DB2 V11.1 for LUW.
    Driver version 5.1.4.000173 (F000289.U000127)

* The Greenplum driver has been certified with the following versions: 

  - Oracle JDK 11
    Driver version 5.1.4.000183 (F000381.U000186) 

  - OpenJDK 11
    Driver version 5.1.4.000183 (F000381.U000186) 

  - OpenJDK 8 on Windows and Linux
    Driver version 5.1.4.000183 (F000381.U000186) 

  - Greenplum 5.0, 5.1, 5.2, 5.3, 5.4, 5.5.
    Driver version 5.1.4.000162 (F000369.U000179)

  - Pivotal HDB (HAWQ) 2.0.
    Driver version 5.1.4.000118 (F000228.U000096)

* The IBM Informix driver has been certified with the following versions:

  - Oracle JDK 11
    Driver version 5.1.4.000092 (F000280.U000123)  

  - OpenJDK 11
    Driver version 5.1.4.000092 (F000280.U000123)  

  - OpenJDK 8 on Windows and Linux
    Driver version 5.1.4.000092 (F000280.U000123)   

* The MongoDB driver has been certified with the following versions:

  - Oracle JDK 11
    Driver version 5.1.4.000108 (C0188.F000209.U000089)

  - OpenJDK 11
    Driver version 5.1.4.000108 (C0188.F000209.U000089)

  - OpenJDK 8 on Windows and Linux
    Driver version 5.1.4.000108 (C0188.F000209.U000089)

* The MySQL driver has been certified with the following versions:

  - Oracle JDK 11
    Driver version 5.1.4.000181 (F000381.U000186)  

  - OpenJDK 11
    Driver version 5.1.4.000181 (F000381.U000186)  

  - OpenJDK 8 on Windows and Linux
    Driver version 5.1.4.000181 (F000381.U000186)

* The OpenEdge driver has been certified with the following versions: 

  - Oracle JDK 11
    Driver version 5.1.4.000142 (F000376.U000183)  

  - OpenJDK 11
    Driver version 5.1.4.000142 (F000376.U000183)  

  - OpenJDK 8 on Windows and Linux
    Driver version 5.1.4.000142 (F000376.U000183)  

  - Progress OpenEdge 11.7 
    Driver version 5.1.4.000130 (F000322.U000140) 

  - Progress OpenEdge 11.6
    Driver version 5.1.4.000127 (F000281.U000124)

* The Oracle driver has been certified with the following versions:  
  
  - Oracle JDK 11
    Driver version 5.1.4.000466 (F000381.U000186)

  - OpenJDK 11 
    Driver version 5.1.4.000466 (F000381.U000186)  

  - OpenJDK 8 on Windows and Linux
    Driver version 5.1.4.000466 (F000381.U000186)

  - Oracle 18c (18.3)
    Driver version 5.1.4.000454 (F000378.U000183)

  - Oracle Database Cloud Service 18c R1 (18.1)
    Driver version 5.1.4.000454 (F000378.U000183)

  - Oracle 18c R1 (18.1)
    Driver version 5.1.4.000454 (F000378.U000183)

  - Oracle Autonomous Data Warehouse Cloud 12c R2 (12.2)
    Driver version 5.1.4.000438 (F000360.U000178)
  
  - Oracle 12c R2 (12.2).
    Driver version 5.1.4.000391 (F000327.U000148)   
  
* The PostgreSQL driver has been certified with the following versions:

  - Oracle JDK 11
    Driver version 5.1.4.000134 (F000378.U000183)

  - OpenJDK 11 
    Driver version 5.1.4.000134 (F000378.U000183)  

  - OpenJDK 8 on Windows and Linux
    Driver version 5.1.4.000134 (F000378.U000183)

  - Amazon Aurora PostgreSQL 1.0.
    Driver version 5.1.4.000130 (F000375.U000182)

  - PostgreSQL 10.1.
    Driver version 5.1.4.000121 (F000358.U000177)

  - PostgreSQL 9.5 and 9.6.
    Driver version 5.1.4.000099 (F000274.U000123)

* The Salesforce driver has been certified with the following versions: 
  
  - Oracle JDK 11
    Driver version 5.1.4.000226 (F000381.U000186)

  - OpenJDK 11 
    Driver version 5.1.4.000226 (F000381.U000186)  

  - OpenJDK 8 on Windows and Linux
    Driver version 5.1.4.000226 (F000381.U000186)

  - Salesforce API version 38
    Driver version 5.1.4.000185 (C0242.F000319.U000137)
  
  - Salesforce API versions 33 and 34.
    Driver version 5.1.4.000136 (C0195.F000226.U000096)

* The SQL Server driver has been certified with the following versions:

  - Oracle JDK 11
    Driver version 5.1.4.000242 (F000381.U000186)

  - OpenJDK 11 
    Driver version 5.1.4.000242 (F000381.U000186)  

  - OpenJDK 8 on Windows and Linux
    Driver version 5.1.4.000242 (F000381.U000186)

  - Microsoft SQL Server 2016
    Driver version 5.1.4.000161 (F000289.U000127)

* The Sybase driver has been certified with the following versions:

  - Oracle JDK 11
    Driver version 5.1.4.000118 (F000374.U000181)  

  - OpenJDK 11
    Driver version 5.1.4.000118 (F000374.U000181)  

  - OpenJDK 8 on Windows and Linux
    Driver version 5.1.4.000118 (F000374.U000181)


Kerberos krb5.conf File
-----------------------
The drivers for DB2, SQL Server, Oracle, Sybase and Hive support Kerberos
authentication. These drivers no longer set the java.security.krb5.conf system
property to force the use of the krb5.conf file installed with the driver jar
files in the /lib directory of the product installation directory. For the
latest instructions on configuring the drivers for Kerberos, see the "Notes,
Known Problems, and Restrictions" section below.

Statement Pooling
-----------------
Added the RegisterStatementPoolMonitorMBean connection property. Note that the
driver no longer registers the Statement Pool Monitor as a JMX MBean by default.
You must set RegisterStatementPoolMonitorMBean to true to register the Statement
Pool Monitor and manage statement pooling with standard JMX API calls.
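For illustration, a connection URL that enables statement pooling and registers
the monitor as a JMX MBean might look like the following sketch (the server,
database, and pool size are placeholders, and MaxPooledStatements is assumed
here as the property that sizes the driver's internal statement pool):

```
jdbc:datadirect:sqlserver://server1:1433;DatabaseName=test;
   MaxPooledStatements=20;RegisterStatementPoolMonitorMBean=true
```

Once registered, the Statement Pool Monitor can be inspected with standard JMX
tools such as JConsole.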

Driver for Apache Hive
----------------------
* The BatchMechanism connection property has been added to the driver. When 
  BatchMechanism is set to multiRowInsert, the driver executes a single insert
  for all the rows contained in a parameter array. multiRowInsert is the default
  setting and provides substantial performance gains when performing batch
  inserts.
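  For example, a connection URL for batch inserts through a parameter array
  might look like the following sketch (the host, port, and database are
  placeholders; because multiRowInsert is the default, it is shown here only
  for clarity):

```
jdbc:datadirect:hive://server1:10000;DatabaseName=default;
   BatchMechanism=multiRowInsert
```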

* The driver has been enhanced to support SSL for Apache Hive 0.13.0 and higher,
  incorporating the addition of nine new connection properties.

* The driver's Kerberos functionality has been enhanced to support SASL-QOP
  data integrity and confidentiality.

DB2 Driver
----------
* For DB2 for z/OS, the AlternateID connection property has been modified to set
  the name of the schema in the DB2 CURRENT SCHEMA special register instead of
  the DB2 CURRENT SQLID special register. AlternateID now sets the name of the
  schema in the CURRENT SCHEMA special register for DB2 for i, DB2 for
  Linux/UNIX/Windows, and DB2 for z/OS.

* Added support for cursor type OUT parameters for DB2 for Linux, UNIX, Windows
  stored procedures.
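As an illustration, the following sketch of a connection URL sets the CURRENT
SCHEMA special register to SALES for the connection (the server, port,
database, and schema names are placeholders):

```
jdbc:datadirect:db2://server1:50000;DatabaseName=SAMPLE;
   AlternateID=SALES
```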

Greenplum Driver
----------------
* The driver has been enhanced to support the following data types:
  Citext, JSON, and UUID.

* Added support for the Kerberos authentication protocol with the following
  connection properties:
  - AuthenticationMethod
  - ServicePrincipalName
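  A Kerberos connection URL for the driver might look like the following sketch
  (the host, port, database, and service principal are placeholders; see
  "Configuring the Drivers to Use Kerberos" in the "Notes, Known Problems, and
  Restrictions" section below for the required JAAS and krb5.conf setup):

```
jdbc:datadirect:greenplum://server1:5432;DatabaseName=test;
   AuthenticationMethod=kerberos;ServicePrincipalName=svc/server1@XYZ.COM
```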
  
Microsoft SQL Server Driver 
---------------------------
The driver has been enhanced to support Azure Active Directory authentication 
(Azure AD authentication). Azure AD authentication is an alternative to SQL 
Server Authentication that allows administrators to centrally manage user 
permissions to Azure SQL Database data stores. To enable Azure AD 
authentication, the new ActiveDirectoryPassword value must be specified for the
AuthenticationMethod connection property 
(AuthenticationMethod=ActiveDirectoryPassword). In addition, the appropriate 
values must be specified for the HostNameInCertificate, User and Password 
connection properties. Refer to the "Azure Active Directory Authentication" and
"AuthenticationMethod" sections in the user's guide for more information.
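Putting these properties together, an Azure AD connection URL might look like
the following sketch (the server, database, certificate wildcard, and
credentials are placeholders; EncryptionMethod=SSL is assumed here because
HostNameInCertificate applies to SSL-encrypted connections):

```
jdbc:datadirect:sqlserver://myserver.database.windows.net:1433;
   DatabaseName=test;AuthenticationMethod=ActiveDirectoryPassword;
   EncryptionMethod=SSL;HostNameInCertificate=*.database.windows.net;
   User=test01@xyz.com;Password=secret
```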
 
MySQL Driver
------------
The driver has been enhanced to support sha256_password and
caching_sha2_password authentication plugins.
 
Oracle Driver
-------------
* Support has been added for Oracle Database Vault.

* Support has been added for the Oracle Database Exadata Cloud Service.

* The Oracle driver has been enhanced to support Oracle Wallet SSL
  authentication introduced in Oracle 11.1.0.6. The AuthenticationMethod
  connection property should be set to either SSL or SSLUIDPassword to allow
  SSL authentication when connecting with the driver.

  - When AuthenticationMethod=SSL, the driver uses SSL certificate information
    to authenticate the client with the server. The Keystore property must
    specify the Oracle Wallet that has the SSL certificate information, and the
    KeystorePassword property must specify the required password. The User and
    Password properties should not be specified. Here is an example connection
    URL.
     jdbc:datadirect:oracle://server3:1521;SID=ASC;encryptionMethod=ssl;
        authenticationMethod=SSL;keystore=.\wallets\keystore.p12;
        keystorePassword=Passw0rd
  
  - When AuthenticationMethod=SSLUIDPassword, the driver uses user ID/password
    and SSL authentication to connect with the server. The User and Password
    properties must be specified, the Keystore property must specify the Oracle
    Wallet that has the SSL certificate information, and the KeystorePassword
    property must specify the required password. Here is an example connection
    URL.
     jdbc:datadirect:oracle://server3:1521;SID=ASC;User=test;
        Password=secret;encryptionMethod=ssl;
        authenticationMethod=SSLUIDPassword;
        keystore=.\wallets\keystore.p12;keystorePassword=Passw0rd

  Note: When Oracle Wallet SSO is used as the keystore or truststore, the 
        keystorePassword and truststorePassword properties are not required.

* The Oracle driver has been enhanced to support the following new data 
  integrity algorithms for Oracle 12c and higher:
  - SHA256
  - SHA384
  - SHA512

  To use these algorithms, specify their values with the DataIntegrity connection
  property (for example, DataIntegrity=(SHA256,SHA384)) and enable data integrity
  checks with the DataIntegrityLevel property. For more information on using 
  these properties, refer to the DATADIRECT CONNECT SERIES FOR JDBC USER'S GUIDE.	 
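  For example, a connection URL combining both properties might look like the
  following sketch (the server and SID are placeholders, and the
  DataIntegrityLevel value shown is illustrative; check the user's guide for
  the accepted values):

```
jdbc:datadirect:oracle://server3:1521;SID=ASC;
   DataIntegrityLevel=required;DataIntegrity=(SHA256,SHA384)
```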

* The LOBPrefetchSize connection property has been added to the driver and is
  supported for Oracle database versions 12.1.0.1 and higher. This connection
  property allows you to specify the size of prefetch data the driver returns 
  for BLOBs and CLOBs. With LOB prefetch enabled, the driver can return LOB 
  meta-data and the beginning of LOB data along with the LOB locator during a 
  fetch operation. This can significantly improve performance, especially for
  small LOBs that can potentially be prefetched in their entirety, because the
  data is available without additional round trips through the LOB protocol.
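  For example, the following sketch of a connection URL requests 4000 bytes of
  prefetched LOB data (the size shown is illustrative, not a recommended or
  default value):

```
jdbc:datadirect:oracle://server3:1521;SID=ASC;
   LOBPrefetchSize=4000
```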
  
PostgreSQL Driver
-----------------
* The driver has been enhanced to support the following data types:
  Citext, JSON, JSONB, and UUID.

* Added support for the Kerberos authentication protocol with the following
  connection properties:
  - AuthenticationMethod
  - ServicePrincipalName

* The ExtendedColumnMetadata connection property has been added to the driver.
  This property determines how the driver returns column metadata when 
  retrieving results with ResultSetMetaData methods.  

Salesforce JVM Requirement
--------------------------
The Salesforce driver has been updated to require a Java SE 7 or higher JVM to
comply with revisions to Salesforce security standards. Beginning June 25th,
2016, Salesforce is deprecating support for the TLS 1.0 encryption protocol for
inbound and outbound connections. TLS 1.0 is initially being disabled for
Sandbox instances before being retired for all instances in early 2017. To
maintain compatibility with Salesforce services, the driver must use a JVM that
allows TLS 1.0 to be disabled independently of other encryption protocols. This
functionality is available with Java SE 7 and higher. Therefore, Java SE 7 or
higher must be installed on your system and the JVM must be defined on your
system path to use the driver and Salesforce services. Beginning with build
5.1.4.000146, the driver will return an error if you attempt to connect using a
Java SE 6 or earlier JVM. For more information on the Salesforce disablement of
TLS 1.0, refer to:
https://help.salesforce.com/apex/HTViewSolution?id=000221207#Whatischange

Sybase Driver  
-------------
* The version of Bouncy Castle that ships with the driver has been upgraded to 
  1.60, which fixes the following security vulnerabilities:
  - CVE-2018-1000613
  - CVE-2018-1000180
  - CVE-2017-13098
  This upgrade is available starting in build 5.1.4.000120 of the driver. For 
  more information on the vulnerabilities resolved by this upgrade, refer 
  to https://nvd.nist.gov/.  
  

     Changes for Service Pack 4

Certifications
--------------
* The OpenEdge driver has been certified with Progress OpenEdge 11.4 and 11.5.

* The PostgreSQL driver has been certified with PostgreSQL 9.3 and 9.4.

* The Greenplum driver has been certified with Greenplum 4.3 and Pivotal
  HAWQ 1.2.

* The DB2 driver has been certified with DB2 for i 7.2.

* The driver for Apache Hive has been certified with Apache Hive 0.13 and 0.14.

* The driver for Apache Hive has been certified with the following
  distributions:
  - Hortonworks (HDP) 2.2 with Apache Hive 0.14
  - Cloudera (CDH) 5.2 with Apache Hive 0.13
  - Amazon (AMI) 3.2-3.3.1 with Apache Hive 0.13
  - Hortonworks (HDP) 2.1 with Apache Hive 0.13
  - Cloudera (CDH) 5.0 and 5.1 Apache Hive 0.12

* The Sybase driver has been certified with SAP Adaptive Server Enterprise 16.0
  (formerly Sybase Adaptive Server Enterprise 16.0).

DB2 Driver
----------
The connection properties RandomGenerator and SecureRandomAlgorithm have been
added to the driver.

* RandomGenerator allows you to specify the type of random number generator
  (RNG) the database uses for secure seeding.

* SecureRandomAlgorithm can be used to specify the SecureRandom number
  generation algorithm used for secure seeding with implementations of JDK 8
  or higher when RandomGenerator is set to secureRandom.

Oracle Driver
-------------
* The SDUSize connection property has been added to the driver. This connection
  property allows you to specify the size in bytes of the Session Data Unit
  (SDU) that the driver requests when connecting to the server.

* The SupportBinaryXML connection property has been added to the driver. This
  connection property enables the driver to support XMLType with binary storage
  on servers running Oracle 12C and higher.

* The connection properties RandomGenerator and SecureRandomAlgorithm have been
  added to the driver.
  - RandomGenerator allows you to specify the type of random number generator
    (RNG) the database uses for secure seeding.
  - SecureRandomAlgorithm can be used to specify the SecureRandom number
    generation algorithm used for secure seeding with implementations of JDK 8
    or higher when RandomGenerator is set to secureRandom.

SQL Server
----------
Support for NTLMv2 has been added to the driver. You can use the
AuthenticationMethod connection property to specify that the driver use NTLMv2
authentication when establishing a connection.

Driver for Apache Hive
----------------------
* Support for row-level inserts has been added to the driver.

* The driver has been enhanced to support the Char, Decimal, Date, and Varchar
  data types.

CryptoProtocolVersion Connection Property
-----------------------------------------
To avoid vulnerabilities associated with SSLv3 and SSLv2, including the POODLE
vulnerability, this connection property can be used with any of the following
drivers.
  - DB2 driver            - PostgreSQL driver
  - Greenplum driver      - Progress OpenEdge driver
  - MySQL driver          - Microsoft SQL Server driver
  - Oracle driver         - Sybase driver
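For example, a connection URL restricting the driver to TLS 1.2 might look
like the following sketch (the server and database are placeholders; check the
CryptoProtocolVersion description in the user's guide for the exact protocol
value names your driver accepts):

```
jdbc:datadirect:sqlserver://server1:1433;DatabaseName=test;
   EncryptionMethod=SSL;CryptoProtocolVersion=TLSv1.2
```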

Result Set Holdability
----------------------
Support for result set holdability has been added to the driver.


     Changes for Service Pack 3

Certifications
--------------
* The DB2 driver has been certified with DB2 V11 for z/OS.

* The SQL Server driver has been certified with Microsoft SQL Server 2014.

* The Salesforce driver has been certified with Salesforce API Version 29.


     Changes for Service Pack 2

Certifications
--------------
* The driver for Apache Hive has been certified with Apache Hive 0.11 and 0.12.

* The driver for Apache Hive has been certified with the following
  distributions:
  - Amazon EMR with Apache Hive 0.11
  - Apache Hadoop Hive with Apache Hive 0.11 and 0.12
  - Cloudera (CDH) 4.3, 4.4, and 4.5 with Apache Hive 0.10, 0.11, and 0.12
  - Hortonworks (HDP) 1.3 with Apache Hive 0.11
  - Hortonworks (HDP) 2.0 with Apache Hive 0.12

* The DB2 driver has been certified with DB2 V10.5 for Linux/UNIX/Windows.

* The Greenplum driver has been certified with Pivotal HAWQ 1.1.

* The Informix driver has been certified with Informix 12.10.

* The OpenEdge driver has been certified with Progress OpenEdge 11.1, 11.2, and
  11.3.

* The Oracle driver has been certified with Oracle 12c.

* The Salesforce driver has been certified with Salesforce API Version 28.

Driver for Apache Hive
----------------------
Added support for the Kerberos authentication protocol with the following
connection properties:

* AuthenticationMethod

* ServicePrincipalName

Greenplum Driver
----------------
* Added SSL support for Greenplum 4.2, incorporating eight additional connection
  properties

* Added SupportsCatalogs connection property, which enables driver support for
  catalog calls

* Added four connection properties to handle VARCHAR, LONGVARCHAR, and NUMERIC
  data types: MaxVarcharSize, MaxLongVarcharSize, MaxNumericPrecision, and
  MaxNumericScale

Oracle Driver
-------------
Modified to support all Oracle 11gR2 Kerberos encryption algorithms

PostgreSQL Driver
-----------------
* Added SupportsCatalogs connection property, which enables driver support for
  catalog calls

* Added four connection properties to handle VARCHAR, LONGVARCHAR, and NUMERIC
  data types: MaxVarcharSize, MaxLongVarcharSize, MaxNumericPrecision, and
  MaxNumericScale


     Changes for Service Pack 1

New Drivers
-----------
* Driver for Apache Hive(TM)
  - Supports Apache Hive 0.8.0 and higher
  - Supports HiveServer1 and HiveServer2 protocols
  - Supports Hive distributions:
    - Amazon Elastic MapReduce (Amazon EMR)
    - Apache Hadoop Hive
    - Cloudera's Distribution Including Apache Hadoop (CDH)
    - MapR Distribution for Apache Hadoop (MapR)
  - Returns result set metadata for parameterized statements that have been
    prepared but not yet executed
  - Supports connection pooling
  - Includes the LoginTimeout connection property which allows you to specify
    the amount of time the driver waits for a connection to be established
    before timing out the connection request
  - Includes the TransactionMode connection property which allows you to
    configure the driver to report that it supports transactions, even though
    Hive does not support transactions. This provides a workaround for
    applications that do not work with a driver that reports transactions
    are not supported.
  - The driver provides support for the following standard SQL functionality:
    - Create Table and Create View
    - Insert
    - Drop Table and Drop View
    - Batches in HiveServer2 connections

* Greenplum Driver
  - Supports Greenplum database versions 4.2, 4.1, 4.0, 3.3, 3.2
  - Supports connection pooling
  - Supports the DataDirect Bulk Load API
  - Includes the TransactionErrorBehavior connection property which determines
    how the driver handles errors that occur within a transaction
  - Includes the LoginTimeout connection property which allows you to specify
    the amount of time the driver waits for a connection to be established
    before timing out the connection request

* PostgreSQL Driver
  - Supports PostgreSQL database versions 9.2, 9.1, 9.0, 8.4, 8.3, 8.2
  - Supports SSL protocol for sending encrypted data
  - Supports connection pooling
  - Supports the DataDirect Bulk Load API
  - Includes the TransactionErrorBehavior connection property which determines
    how the driver handles errors that occur within a transaction
  - Includes the LoginTimeout connection property which allows you to specify
    the amount of time the driver waits for a connection to be established
    before timing out the connection request

Certifications
--------------
* The Salesforce driver has been certified with Salesforce API Version 27.

* The MySQL driver has been certified with MySQL 5.6.

DataDirect Spy
--------------
Enhanced to throw a warning when EnableBulkLoad fails in the Oracle,
SQL Server, Sybase, and Salesforce drivers.

Oracle Driver
-------------
Added support for Oracle Wallet

SQL Server Driver
-----------------
* Added ApplicationIntent connection property, which enables you to request
  read-only routing and connect to read-only database replicas.

* Enhanced drivers so that the transaction isolation level can be changed only
  before a transaction is started.

Sybase Driver
-------------
Enhanced AuthenticationMethod connection property to allow for the driver to
send a user ID in clear text and an encrypted password to the server for
authentication.


     Release 5.1.0 Features

Certifications
--------------
* The DB2 driver has been certified with DB2 V10.1 for Linux/UNIX/Windows.

* The DB2 driver has been certified with DB2 pureScale.

* The Salesforce driver has been certified with Salesforce API Version 26.

* The SQL Server driver has been certified with Microsoft SQL Server 2012.

* The SQL Server driver has been certified with Microsoft Windows Azure SQL
  Database.

Oracle Driver
-------------
Support for the Oracle Advanced Security (OAS) data encryption and data
integrity feature, including support for the following new connection
properties:

* DataIntegrityLevel sets the level of OAS data integrity used for data sent
  between the driver and database server.

* DataIntegrityTypes specifies one or multiple algorithms to protect against 
  attacks that intercept and modify data being transmitted between the client 
  and server when OAS data integrity is enabled using the DataIntegrityLevel
  property.

* EncryptionLevel determines whether data is encrypted and decrypted when 
  transmitted over the network between the driver and database server using 
  OAS encryption.

* EncryptionTypes specifies one or multiple algorithms to use if OAS
  encryption is enabled using the EncryptionLevel property.

Salesforce Driver
-----------------
* The new KeywordConflictSuffix keyword=value pair for the ConfigOptions
  property allows you to specify a string that the driver appends to any object
  or field name that conflicts with a SQL engine keyword. For example, if you
  specify KeywordConflictSuffix=TAB, the driver maps the Case object in
  Salesforce to CASETAB.

* The new RefreshSchema connection property specifies whether the driver
  automatically refreshes the remote object mapping and other information
  contained in a remote schema the first time a user connects to an embedded
  database.


     Installation/Uninstallation

Installing
----------
A complete installation of both DataDirect Connect for JDBC and DataDirect
Connect XE for JDBC requires approximately 67 MB of hard disk space.

Java SE 5 or higher is required to use DataDirect Connect Series for JDBC.
Standard installations of Java SE on some platforms do not include the jar file
containing the extended encoding set that is required to support some of the
less common database code pages. To verify whether your Java SE version provides
extended code page support, make sure that the charsets.jar file is installed
in the \lib subdirectory of your Java SE installation directory. If you do not 
have the charsets.jar file, install the international version of Java SE.

The installer accepts multiple product license keys. For details, refer to the
DATADIRECT CONNECT SERIES FOR JDBC INSTALLATION GUIDE.

Uninstalling on Windows
-----------------------
When you connect with the Salesforce driver, the driver creates multiple local
Salesforce files in the <install_dir>\testforjdbc subdirectory, where
<install_dir> is your product installation directory. If you connect using the
default Salesforce driver URL jdbc:datadirect:sforce://login.salesforce.com,
the names of these files are associated with your user name. For example, if
your user name is test01@xyz.com, the local Salesforce files that are created 
would be:

<install_dir>\testforjdbc\ddsforce.log
<install_dir>\testforjdbc\test01.app.log
<install_dir>\testforjdbc\test01.config
<install_dir>\testforjdbc\test01.log
<install_dir>\testforjdbc\test01.properties
<install_dir>\testforjdbc\test01.SFORCE.map
<install_dir>\testforjdbc\test01.SFORCE.native
  
When you run the Windows uninstaller, these files are not removed. You can
explicitly delete them.


     Available DataDirect Connect(R) Series for JDBC Drivers

See http://www.datadirect.com/products/jdbc/matrix/jdbcpublic.htm for a complete
list of supported databases/sources.

DataDirect Connect for JDBC Drivers
-----------------------------------
DB2 (db2.jar)
Informix (informix.jar) 
MySQL (mysql.jar)
Oracle (oracle.jar)
PostgreSQL (postgresql.jar) 
Progress OpenEdge (openedgewp.jar)
SQL Server (sqlserver.jar) 
Sybase (sybase.jar)

DataDirect Connect XE for JDBC Drivers
--------------------------------------
Apache Hive (hive.jar)
Greenplum (greenplum.jar)
Salesforce (sforce.jar)


     Notes, Known Problems, and Restrictions

The following are notes, known problems, or restrictions with Release 5.1.4 of
DataDirect Connect Series for JDBC.

dashDB Constraints for Tables
-----------------------------
By default, dashDB does not enforce constraints on tables. As a result, dashDB
does not enforce uniqueness on new tables, and incorrect or unexpected results
can occur if table data violates an unenforced constraint. If you want to
enforce uniqueness, specify the ENFORCED parameter when creating or altering
unique or referential constraints, such as primary keys and foreign keys.


Configuring the Drivers to Use Kerberos
---------------------------------------
The drivers for DB2, SQL Server, Oracle, Sybase and Hive support Kerberos
authentication. These drivers no longer set the java.security.krb5.conf system
property to force the use of the krb5.conf file installed with the driver jar
files in the /lib directory of the product installation directory. Here are the
latest instructions on configuring the drivers for Kerberos.

To configure a driver for Kerberos:

1. Set the AuthenticationMethod property to kerberos.

2. Specify the JAAS login module in your JAAS login configuration file using
   either of the following methods.
   - Modify the JDBC_DRIVER_01 entry in the JDBCDriverLogin.conf file to include
     the JAAS login module information needed for your environment. The
     JDBCDriverLogin.conf file is installed in the /lib directory of the driver
     installation directory.
   - Specify a JAAS login configuration file directly in your application with
     the java.security.auth.login.config system property. The specified login
     configuration file must contain the JAAS login module information with the
     entry JDBC_DRIVER_01.
   NOTE: Whether you are using the JDBCDriverLogin.conf file or another file,
   the login configuration file must contain the entry JDBC_DRIVER_01 with JAAS
   login module information. The following examples show that the JAAS login
   module information depends on your JRE.
      * Oracle JRE Example
        JDBC_DRIVER_01 {com.sun.security.auth.module.Krb5LoginModule
        required useTicketCache=true;};
      * IBM JRE Example
        JDBC_DRIVER_01 {com.ibm.security.auth.module.Krb5LoginModule
        required useDefaultCcache=true;};

3. Set the default realm name and the KDC name for that realm using either of
   the following methods. (If using Windows Active Directory, the Kerberos realm
   name is the Windows domain name and the KDC name is the Windows domain
   controller name.)
   - Modify the krb5.conf file to include the default realm name and the KDC
     name for that realm. For example, if your Kerberos realm name is XYZ.COM
     and your KDC name is kdc1, your krb5.conf file would include the following
     entries.
         [libdefaults] 
         default_realm = XYZ.COM
         [realms]
         XYZ.COM = {kdc = kdc1}
     NOTE: During installation, a krb5.conf file is installed in the /lib
     directory of the product installation directory. The installed krb5.conf
     file contains generic syntax for setting the default realm name and the KDC
     name for that realm. If you are not already using another krb5.conf file
     for your Kerberos implementation, you can modify it to suit your
     environment. However, you will either need to specify the location of this
     file using the java.security.krb5.conf system property, or you will need to
     add the file to a directory where it may be found by your JVM. See "Kerberos
     Requirements" in your Java documentation for details on the algorithm used
     to locate the krb5.conf file.
   - Specify the Java system properties, java.security.krb5.realm and
     java.security.krb5.kdc, in your application. For example, if the default
     realm name is XYZ.COM and the KDC name is kdc1, your application would
     include the following settings.
         System.setProperty("java.security.krb5.realm","XYZ.COM");
          System.setProperty("java.security.krb5.kdc","kdc1");
     NOTE: Even if you do not use the krb5.conf file to specify the realm and
     KDC names, you may need to modify your krb5.conf file to suit your
     environment. Refer to your database vendor documentation for detailed
     information.

4. If using Kerberos authentication with a Java Security Manager, you must
   grant security permissions to the application and driver. See "Permissions
   for Kerberos Authentication" in the User's Guide for examples.

Using the XMLType Data Type with the Oracle Driver
--------------------------------------------------
The default XML storage type was changed from CLOB to BINARY in Oracle 11.2.0.2.
For Oracle 11.2.0.1 and earlier database versions, the driver fully supports the
XML storage type with CLOB. For Oracle 12c and later, you can enable the driver
to support XML storage with BINARY by setting the SupportBinaryXML connection
property to true.

For database versions that fall between Oracle 11.2.0.1 and 12c, columns created
simply as "XMLType" are not supported by the driver. An attempt to obtain the
value of such a column through the driver results in the exception "This column
type is not currently supported by this driver." To avoid this exception, change
the XML storage type to CHARACTER (CLOB) or use the TO_CLOB Oracle function to
cast the column.
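
As an illustration of the TO_CLOB workaround, the following sketch builds the
rewritten query. The helper method and the table and column names are
hypothetical; only the TO_CLOB cast itself comes from the note above:

```java
public class XmlTypeWorkaround {
    // Build a query that casts an XMLType column to CLOB on the server so
    // the driver can fetch it (table and column names are illustrative).
    static String castToClob(String column, String table) {
        return "SELECT TO_CLOB(" + column + ") FROM " + table;
    }

    public static void main(String[] args) {
        System.out.println(castToClob("xml_col", "my_table"));
    }
}
```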

Using Kerberos Authentication with DB2 V10.5 for Linux, UNIX, and Windows
-------------------------------------------------------------------------
The DB2 driver cannot authenticate with Keberos due to the behavior of the
DB2 V10.5 for Linux, UNIX, and Windows server. The issue has been opened with
IBM: PMR 06453,756,000.

Stored Procedures and Updates/Deletes for Pivotal HAWQ 1.1
----------------------------------------------------------
The Greenplum driver does not support stored procedures and updates/deletes for
Pivotal HAWQ 1.1.

Oracle 12c Server Issues
------------------------
The following issues were discovered during certification with Oracle 12c.

* Kerberos cannot be configured successfully with Oracle 12c. This server issue
  has been reported to Oracle. Oracle has acknowledged the defect:
  Bug 17497520 - KERBEROS CONNECTIONS USING A 12C CLIENT AND THE OKINIT
  REQUESTED TGT ARE FAILING.

* When using the DataIntegrity (checksums) Oracle Advanced Security feature with
  Oracle 12c, the server may unexpectedly drop the connection with the driver.
  This server issue has been reported to Oracle: SR 3-7971196511.

* The newPassword connection property is only supported when connecting to an
  Oracle server earlier than Oracle 12c. If the newPassword connection property
  is specified when attempting to connect to an Oracle 12c server, the driver
  throws a “No Matching Authentication Protocol” exception.

Using Bulk Load with PostgreSQL and Greenplum
---------------------------------------------
If the driver throws the error "The specified connection object is not valid for
creation of a bulk load object" while you are attempting to use the DataDirect
Bulk Load API, ensure that postgresql.jar (or greenplum.jar) is listed before
any other DataDirect drivers on your classpath.

JAXB API for Salesforce with Java SE 6
--------------------------------------
The Salesforce driver uses the Java Architecture for XML Binding (JAXB) 2.2 API.
Some older versions of Java SE 6 use a version of the JAXB API that is
incompatible with that used by the Salesforce driver. If you receive the
following exception, update your JVM to the latest version: 

  JAXB 2.1 API is being loaded from the bootstrap classloader, but this
  RI (from <driver location>) needs 2.2 API. 

If for some reason updating to the latest version is not possible, you can
override the JAXB jar file in your JVM with a compatible JAXB jar file. You can
download the latest JAXB API jar file from http://jaxb.java.net.

You can override the JAXB jar file in your JVM using either of the following
methods:

* Copy the downloaded JAXB jar file to the endorsed directory as described in 
  http://docs.oracle.com/javase/6/docs/technotes/guides/standards/index.html.

* Add the downloaded JAXB jar file to the boot classpath when launching your
  application using the JVM argument:

  -Xbootclasspath/p:<jaxb_jar_file>

  where jaxb_jar_file is the path and filename of the JAXB jar file you
  downloaded.

  For example, if the following command is used to launch your application:

  java MyApp arg1 arg2

  you can modify that command to: 
	
  Windows Example:

  java -Xbootclasspath/p:C:\jaxb\jaxb-api-2.2.3.jar MyApp arg1 arg2

  UNIX/Linux Example:

  java -Xbootclasspath/p:/usr/lib/jaxb/jaxb-api-2.2.3.jar MyApp arg1 arg2

Using the SELECT...INTO Statement with the Salesforce Driver
------------------------------------------------------------
The SELECT...INTO statement is supported for local tables only. The source and
destination tables must both be local tables. Creating remote tables in
Salesforce or loading from remote Salesforce tables using SELECT…INTO is not
supported. Additionally, the option to create the destination table as a
temporary table does not currently work.

Stored Procedures Not Supported for Database.com
------------------------------------------------
The Salesforce driver incorrectly reports that it supports stored procedures for
Database.com (for example, using DatabaseMetaData.supportsStoredProcedures()).
Stored procedures for Database.com are not supported.

Using Bulk Load with Oracle
---------------------------
For the best performance when using the bulk load protocol against Oracle, an
application can specify "enableBulkLoad=true" and perform its batches of
parameterized inserts within a manual transaction. Using the bulk load protocol
affects driver behavior: the application should do nothing else within the
transaction. If another operation is performed BEFORE the inserts, the driver
cannot use the bulk load protocol and chooses a different approach. If another
"execute" is performed AFTER the inserts, the driver throws the following
exception:

   An execute operation is not allowed at this time, due to unfinished 
   bulk loads. Please perform a "commit" or "rollback".
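
The pattern described above can be sketched as follows. The connection URL,
credentials, table, and class name are all illustrative, and the database work
only runs when a user name and password are supplied on the command line:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class OracleBulkLoadSketch {
    // Append enableBulkLoad=true to an Oracle connection URL
    // (host, port, and service name are illustrative).
    static String bulkLoadUrl(String host, int port, String service) {
        return "jdbc:datadirect:oracle://" + host + ":" + port
             + ";ServiceName=" + service + ";enableBulkLoad=true";
    }

    public static void main(String[] args) throws SQLException {
        String url = bulkLoadUrl("server", 1521, "ORCL");
        System.out.println(url);
        if (args.length < 2) return; // no credentials supplied; skip the DB work

        try (Connection con = DriverManager.getConnection(url, args[0], args[1])) {
            con.setAutoCommit(false); // manual transaction: batched inserts only
            try (PreparedStatement ps =
                     con.prepareStatement("INSERT INTO emp (id, name) VALUES (?, ?)")) {
                for (int i = 0; i < 1000; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "name" + i);
                    ps.addBatch();
                }
                ps.executeBatch(); // the driver can use the bulk load protocol here
            }
            con.commit(); // commit before performing any other execute
        }
    }
}
```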

Using Bulk Load with Microsoft SQL Server 2000 and Higher
---------------------------------------------------------
For optimal performance, minimal logging and table locking must be enabled.
Refer to the following Web sites for more information about enabling minimal
logging:

http://msdn.microsoft.com/en-us/library/ms190422.aspx
http://msdn.microsoft.com/en-us/library/ms190203.aspx

Table locking, a bulk load option, is enabled by default. Table locking prevents
other transactions from accessing the table you are loading to during the bulk
load. See the description of the BulkLoadOptions connection property in the
DATADIRECT CONNECT FOR JDBC USER'S GUIDE for information about enabling and
disabling bulk load options.

Starting the Performance Tuning Wizard
--------------------------------------
* When starting the Performance Tuning Wizard, security features set in your
  browser can prevent the Performance Wizard from launching. A security warning
  message is displayed. Often, the warning message provides instructions for
  unblocking the Performance Wizard for the current session. To allow the
  Performance Wizard to launch without encountering a security warning message,
  you can modify the security settings in your browser. Check with your system
  administrator before disabling any security features.

* The Performance Wizard does not automatically launch from the installer when
  the installer is run on the Macintosh operating system. You can start the
  Performance Wizard by executing the install_dir/wizards/index.html file.

Executing Scripts (for UNIX Users)
----------------------------------
If you receive an error message when executing any DataDirect Connect for JDBC
shell script, make sure that the file has EXECUTE permission. To do this, use
the chmod command. For example, to grant EXECUTE permission to the
testforjdbc.sh file, change to the directory containing testforjdbc.sh and
enter:

chmod +x testforjdbc.sh

Distributed Transactions Using JTA
----------------------------------
If you are using JTA for distributed transactions, you may encounter 
problems when performing certain operations, as shown in the following 
examples:

SQL SERVER 7

1. Problem: SQL Server 7 does not allow resource sharing because it cannot
release the connection to a transaction until it commits or rolls back.

  xaResource.start(xid1, TMNOFLAGS)
  ...
  xaResource.end(xid1, TMSUCCESS)
  xaResource.start(xid2, TMNOFLAGS) ---> fail
 
2. Problem:  Table2 insert rolls back. It should not roll back because it is
outside of the transaction scope.

  xaResource.start(xid1, TMNOFLAGS)
  stmt.executeUpdate("insert into table1 values (1)");
  xaResource.end(xid1, TMSUCCESS)
 
  stmt.executeUpdate("insert into table2 values (2)");
 
  xaResource.prepare(xid1);
  xaResource.rollback(xid1);
 
SQL SERVER 7 and SQL SERVER 2000

1. Problem: Recover should not return xid1 because it is not yet prepared.

  xaResource.start(xid1, TMNOFLAGS)
  xaResource.recover(TMSTARTRSCAN) ---> returns xid1 transaction

This problem has been resolved in DTC patch QFE28, fix number winse#47009,
"In-doubt transactions are not correctly removed from the in-doubt transactions
list".

This Microsoft issue is documented at
http://support.microsoft.com/default.aspx?scid=kb;en-us;828748.

All Drivers
-----------
* The DataDirect Connect Series for JDBC drivers allow PreparedStatement.setXXX
  methods and ResultSet.getXXX methods on Blob/Clob data types, in addition to
  the functionality described in the JDBC specification. The supported
  conversions typically are the same as those for LONGVARBINARY/LONGVARCHAR,
  except where limited by database support.

* Calling CallableStatement.registerOutParameter(parameterIndex, sqlType)
  with sqlType Types.NUMERIC or Types.DECIMAL sets the scale of the output
  parameter to zero (0). According to the JDBC specification, calling 
  CallableStatement.registerOutParameter(parameterIndex, sqlType, scale) is
  the recommended method for registering NUMERIC or DECIMAL output parameters. 

* When attempting to create an updatable, scroll-sensitive result set for a
  query that contains an expression as one of the columns, the driver cannot
  satisfy the scroll-sensitive request. The driver downgrades the type of the
  result returned to scroll-insensitive.

* The DataDirect Connect Series for JDBC drivers support retrieval of output
  parameters from a stored procedure before all result sets and/or update counts
  have been completely processed. When CallableStatement.getXXX is called,
  result sets and update counts that have not yet been processed by the
  application are discarded to make the output parameter data available.
  Warnings are generated when results are discarded.

* The preferred method for executing a stored procedure that generates result
  sets and update counts is using CallableStatement.execute(). If multiple
  results are generated using executeUpdate, the first update count is returned.
  Any result sets prior to the first update count are discarded. If multiple
  results are generated using executeQuery, the first result set is returned.
  Any update counts prior to the first result set are discarded. Warnings are
  generated when result sets or update counts are discarded. 

* The ResultSet methods getTimestamp(), getDate(), and getTime() return
  references to mutable objects. If the object reference returned from any of
  these methods is modified, re-fetching the column using the same method
  returns the modified value. The value is only modified in memory; the database
  value is not modified.
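
Several of the notes above concern CallableStatement usage; the following
sketch combines them, registering a DECIMAL output parameter with an explicit
scale and draining all results before reading the output parameter. The
procedure name my_proc, its parameter, and the command-line JDBC URL are all
hypothetical:

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Types;

public class CallableStatementSketches {
    // Drain every result set and update count a procedure produces, in order.
    static void processAll(CallableStatement cs) throws SQLException {
        boolean isResultSet = cs.execute();
        while (true) {
            if (isResultSet) {
                try (ResultSet rs = cs.getResultSet()) {
                    while (rs.next()) { /* process the row */ }
                }
            } else {
                if (cs.getUpdateCount() == -1) break; // no more results
            }
            isResultSet = cs.getMoreResults();
        }
    }

    public static void main(String[] args) throws SQLException {
        if (args.length < 1) return; // a JDBC URL is required to actually run
        try (Connection con = DriverManager.getConnection(args[0]);
             CallableStatement cs = con.prepareCall("{call my_proc(?)}")) {
            // Register the DECIMAL output parameter with an explicit scale of 2;
            // the two-argument form would force the scale to 0.
            cs.registerOutParameter(1, Types.DECIMAL, 2);
            // Process all result sets and update counts before reading the
            // output parameter, so nothing is discarded silently.
            processAll(cs);
            System.out.println(cs.getBigDecimal(1));
        }
    }
}
```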

Driver for Apache Hive
----------------------
The following are notes, known problems, and restrictions with the driver.
These restrictions are based on using Apache Hive 0.9; other versions of
Apache Hive will have their own restrictions. You may find a more complete
listing of Apache Hive known issues and limitations for your version of
Apache Hive in the Apache Hive user documentation here:
https://cwiki.apache.org/confluence/display/Hive/Home

* OLTP Workloads
  - Note that Apache Hive is not designed for OLTP workloads and does not offer
    real-time queries or row-level updates. Apache Hive is instead designed for
    batch type jobs over large data sets with high latency.

* Known Issues for Apache Hive
  - No support for row-level updates or deletes
  - No difference exists between "NULL" and null values 
  - For HiveServer1 connections, no support for user-level authentication
  - For HiveServer1 connections, no support for canceling a running query
  - For HiveServer1 connections, no support for multiple simultaneous 
    connections per port

* HiveQL
  - Apache Hive uses HiveQL, which provides much of the functionality of SQL
    but has some limitations and syntax differences. For more information,
    refer to the Hive Language Manual at 
    https://cwiki.apache.org/confluence/display/Hive/LanguageManual.
    - A single quote within a string literal must be escaped with a backslash
      (\') rather than by doubling the single quote.
    - Numeric values specified in scientific notation are not supported in
      Hive 0.8.0.
    - Apache Hive supports UNION ALL statements only when embedded in a
      subquery. For example:
        SELECT * FROM (SELECT integercol FROM itable UNION ALL 
        SELECT integercol FROM gtable2) result_table
    - Subqueries are supported, but only in the FROM clause.
    - Join support is limited to equi-joins.

* Transactions
  - Apache Hive does not support transactions, and by default, the Driver for
    Apache Hive reports that transactions are not supported. However, some
    applications will not operate with a driver that reports transactions are
    not supported. The TransactionMode connection property allows you to
    configure the driver to report that it supports transactions. In this mode,
    the driver ignores requests to enter manual commit mode, start a
    transaction, or commit a transaction, and returns success. Requests to
    roll back a transaction return an error regardless of the transaction mode
    specified.

DB2 Driver
----------
* Unlike previous versions of the DB2 driver, the 5.1.0 version of the driver 
  does not buffer the input stream if a parameter is a BLOB, CLOB, or DBCLOB
  type.

* The ResultSet.getObject method returns a Long object instead of a
  BigDecimal object when called on BIGINT columns. In versions previous to
  DataDirect Connect Series for JDBC 3.5, the DataDirect Connect for JDBC DB2
  driver returned a BigDecimal object.

* Scroll-sensitive result sets are not supported. Requests for scroll-sensitive
  result sets are downgraded to scroll-insensitive result sets when possible.
  When this happens, a warning is generated.

* The DB2 driver must be able to determine the data type of the column or stored
  procedure argument to implicitly convert the parameter value. Not all DB2
  database versions support getting parameter metadata for prepared statements.
  Implicit conversions are not supported for database versions that do not
  provide parameter metadata for prepared statements.

Oracle Driver
-------------
* For database versions prior to Oracle 12c, the newPassword connection property
  is supported only when connecting to servers for which the
  ALLOWED_LOGON_VERSION parameter is either not specified or is specified with a
  value of 8. If the newPassword connection property is specified when
  attempting to connect to an Oracle server for which the ALLOWED_LOGON_VERSION
  parameter is specified with a value greater than 8, the driver throws a
  “No Matching Authentication Protocol” exception. The newPassword connection
  property is not supported for Oracle 12c.

* When connecting to Oracle instances running in restricted mode using a
  tnsnames.ora file, you must connect using a service name instead of a SID.

* If using Select failover and a result set contains LOBs, the driver cannot
  recover work in progress for the last Select statement for that result set.
  You must explicitly restart the Select statement if a failover occurs. The
  driver will successfully recover work in progress for any result sets that
  do not contain LOBs.

* If you install the Oracle driver and want to take advantage of JDBC
  distributed transactions through JTA, you must install Oracle8i R3 (8.1.7)
  or higher.

* Because JDBC does not support a cursor data type, the Oracle driver returns
  REF CURSOR output parameters to the application as result sets. For details
  about using REF CURSOR output parameters with the driver, refer to the
  DATADIRECT CONNECT SERIES FOR JDBC USER'S GUIDE.

* By default, values for TIMESTAMP WITH TIME ZONE columns cannot be retrieved
  using the ResultSet.getTimestamp() method because the time zone information is
  lost. The Oracle driver returns NULL when the getTimestamp() method is called
  on a TIMESTAMP WITH TIME ZONE column and generates an exception. For details
  about using the TIMESTAMP WITH TIME ZONE data type with the driver, refer to
  the DATADIRECT CONNECT SERIES FOR JDBC USER'S GUIDE. 

* The Oracle driver describes columns defined as FLOAT or FLOAT(n) as a
  DOUBLE SQL type. Previous to DataDirect Connect Series for JDBC 3.5, the
  driver described these columns as a FLOAT SQL type. Both the DOUBLE type and
  the FLOAT type represent a double precision floating point number. This change
  provides consistent functionality with the DataDirect Connect Series for ODBC
  Oracle driver. The TYPE_NAME field that describes the type name on the Oracle
  database server was changed from number to float to better describe how the
  column was created.

SQL Server Driver
-----------------
* Microsoft SQL Server 7 and SQL Server 2000 Only: Although the SQL Server
  driver fully supports the auto-generated keys feature as described in the
  Microsoft SQL Server chapter of the DATADIRECT CONNECT FOR JDBC USER'S GUIDE,
  some third-party products require an implementation that, regardless of the
  column name specified, causes the driver to return the value of the identity
  column for the following methods:

  Connection.prepareStatement(String sql, int[] columnIndexes)
  Connection.prepareStatement(String sql, String[] columnNames) 

  Statement.execute(String sql, int[] columnIndexes)
  Statement.execute(String sql, String[] columnNames)

  Statement.executeUpdate(String sql, int[] columnIndexes)
  Statement.executeUpdate(String sql, String[] columnNames)

  To work around this problem, set the WorkArounds connection property to 1.
  When WorkArounds=1, calling any of the auto-generated keys methods listed
  above returns the value of the identity column regardless of the name or
  index of the column specified to the method. If multiple names or indexes are
  specified, the driver throws an exception indicating that multiple column
  names or indexes cannot be specified if connected to Microsoft SQL Server 7 or 
  SQL Server 2000.

* In some cases, when using Kerberos authentication, Windows XP and
  Windows Server 2003 clients appear to use NTLM instead of Kerberos to
  authenticate the user with the domain controller. In these cases, user
  credentials are not stored in the local ticket cache and cannot be obtained
  by the SQL Server driver, causing the Windows Authentication login to fail.
  This is caused by a known problem in the Sun 1.4.x JVM. As a workaround, the
  "os.name" system property can be set to "Windows 2000" when running on a
  Windows XP or Windows Server 2003 machine. For example:

  -Dos.name="Windows 2000"

* To ensure correct handling of character parameters, install
  Microsoft SQL Server 7 Service Pack 2 or higher.

* Because of the way CHAR, VARCHAR, and LONGVARCHAR data types are handled
  internally by the driver, parameters of these data types exceeding 4000
  characters in length cannot be compared or sorted, except when using the
  IS NULL or LIKE operators.

Documentation Errata in USER'S GUIDE
------------------------------------
* The "Data Types" and "Using Extended Data Types" topics incorrectly
  indicate support for Oracle 12c extended capabilities for the VARCHAR2,
  NVARCHAR2, and RAW data types. Extended capabilities for these data types are
  not currently supported.

Documentation Errata in REFERENCE
---------------------------------
* The "Oracle Driver" topic under the "getTypeInfo()" section incorrectly
  indicates support for Oracle 12c extended capabilities for the VARCHAR2,
  NVARCHAR2, and RAW data types. Extended capabilities for these data types are
  not currently supported.

Help System Compatibility
-------------------------
* When viewing the installed help system, please note that Google Chrome 
  version 45 is not yet fully supported. When using Google Chrome 45, the table
  of contents does not synchronize with the pages when using the Next and
  Previous buttons to page through the help system, and the Next and Previous 
  buttons appear inactive. To avoid this issue, you can view the installed help 
  with a certified browser or use the online version of the help: 
  http://media.datadirect.com/download/docs/jdbc/alljdbc/help.html

  The certified web browsers and versions for using this help system are: 
  - Google Chrome 44.x and earlier
  - Internet Explorer 7.x, 8.x, 9.x, 10.x, 11.x
  - Firefox 3.x - 39.x
  - Safari 5.x

* Internet Explorer with the Google Toolbar installed sometimes displays the
  following error when the browser is closed: "An error has occurred in the
  script on this page." This is a known issue with the Google Toolbar and has
  been reported to Google. When closing the driver's help system, this error may
  display.


     Using the Documents

The DataDirect Connect Series for JDBC guides are provided in PDF and HTML.

The HTML help system is installed in the help subdirectory of your product
installation directory.

The PDF and HTML versions of the guides, including the HTML help system,
are available on: http://www.progress.com/resources/documentation

You can view the PDF versions using Adobe Reader. To download Adobe Reader,
visit the Adobe Web site: http://www.adobe.com.


     DataDirect Connect Series for JDBC Files

When you extract the contents of the installation download package to your
installer directory, you will notice the following files that are required to
install DataDirect Connect Series for JDBC: 

Windows:     PROGRESS_DATADIRECT_CONNECT_JDBC_5.1.4_WIN.zip
Non-Windows: PROGRESS_DATADIRECT_CONNECT_JDBC_5_1_4.jar

When you install DataDirect Connect Series for JDBC, the installer creates the
following directories and files in the product installation directory (as
determined by the user), represented by INSTALL_DIR.


INSTALL_DIR:
------------
BrandingTool.jar           Standalone branding tool (OEM installs only)

BuildAdapters.jar          File used to create resource adapters

DDProcInfo.exe             Windows executable to start the Processor Information
                           Utility

DDProcInfo                 UNIX/Linux script to start the Processor Information
                           Utility

fixes.txt                  File describing fixes

jdbcreadme.txt             This file

LicenseTool.jar            File required to extend an evaluation installation

NOTICES.txt                Third-Party vendor license agreements


INSTALL_DIR/DB2/bind:
---------------------
iSeries/*.*                Files for explicitly creating DB2 packages on
                           DB2 for i

LUW/*.*                    Files for explicitly creating DB2 packages on
                           Linux/UNIX/Windows

zOS/*.*                    Files for explicitly creating DB2 packages on z/OS


INSTALL_DIR/Examples/bulk:
--------------------------
Load From File/bulkLoadFileDemo.java
                           Java source example for bulk loading from a
                           CSV file

Load From File/load.txt    Sample data for the example

Streaming/bulkLoadStreamingDemo.java	
                           Java source example for bulk loading from a 
                           result set

Threaded Streaming/bulkLoadThreadedStreamingDemo.java	
                           Java source example for multi-threaded bulk 
                           loading from a result set

Threaded Streaming/README.txt
                           Instructions on how to use the 
                           thread.properties file

Threaded Streaming/thread.properties
                           Properties file for the example  


INSTALL_DIR/Examples/connector:
-------------------------------
ConnectorSample.ear        J2EE Application Enterprise Archive file 
                           containing the ConnectorSample application
 
connectorsample.htm        "Using DataDirect Connect for JDBC Resource 
                           Adapters" document

graphics/*.*               Images referenced by the "Using DataDirect 
                           Connect for JDBC Resource Adapters" document
 
src/ConnectorSample.jsp    Source for the JavaServer Page used to access 
                           the ConnectorSample application

src/connectorsample/ConnectorSample.java     
                           Java source file defining the remote 
                           interface for the ConnectorSample EJB

src/connectorsample/ConnectorSampleBean.java  
                           Java source file defining the home interface 
                           for the ConnectorSample EJB

src/connectorsample/ConnectorSampleHome.java  
                           Java source file containing the 
                           implementation for the ConnectorSample EJB


INSTALL_DIR/Examples/JNDI:
--------------------------
JNDI_FILESYSTEM_Example.java
                           Example Java(TM) source file

JNDI_LDAP_Example.java     Example Java source file


INSTALL_DIR/Examples/SforceSamples:
-----------------------------------
buildsamples.bat           Batch file to build the Salesforce example

buildsamples.sh            Shell script to build the Salesforce example 

ddlogging.properties       Logging properties file

runsalesforceconnectsample.bat
                           Batch file to run the Salesforce example

runsalesforceconnectsample.sh
                           Shell script to run the Salesforce example

bin/com/ddtek/jdbc/samples/SalesforceConnectSample.class
                           Java example class

bin/com/ddtek/jdbc/samples/SampleException.class
                           Java example class

src/com/ddtek/jdbc/samples/SalesforceConnectSample.java
                           Java source example

                 
INSTALL_DIR/Help: 
-----------------
index.html                 HTML help system entry file
/*                         Support files and folders for the HTML help system 

INSTALL_DIR/lib:
----------------
db2.jar                    DB2 Driver and DataSource classes

db2.rar                    DB2 resource archive file

greenplum.jar              Greenplum Driver and DataSource classes

hive.jar                   Driver for Apache Hive and DataSource classes

informix.jar               Informix Driver and DataSource classes

informix.rar               Informix resource archive file

mysql.jar                  MySQL Driver and DataSource classes

mysql.rar                  MySQL resource archive file

openedgewp.jar             Progress OpenEdge Driver and DataSource classes

oracle.jar                 Oracle Driver and DataSource classes

oracle.rar                 Oracle resource archive file

postgresql.jar             PostgreSQL Driver and DataSource classes

sforce.jar                 Salesforce Driver and DataSource classes

sqlserver.jar              SQL Server Driver and DataSource classes

sqlserver.rar              SQL Server resource archive file

sybase.jar                 Sybase Driver and DataSource classes

sybase.rar                 Sybase resource archive file

db2packagemanager.jar      DataDirect DB2 Package Manager jar file
 
DDJDBCAuthxx.dll           Windows DLL that provides support for NTLM
                           authentication (32-bit), where xx is the Build
                           number of the DLL

DDJDBC64Authxx.dll         Windows DLL that provides support for NTLM
                           authentication (Itanium 64-bit), where xx is the
                           Build number of the DLL

DDJDBCx64Authxx.dll        Windows DLL that provides support for NTLM
                           authentication (AMD64 and Intel EM64T 64-bit), where
                           xx is the Build number of the DLL

DB2PackageManager.bat      Batch file to start the DataDirect DB2 Package
                           Manager

DB2PackageManager.sh       Shell script to start the DataDirect DB2 Package
                           Manager

JDBCDriver.policy          Security policy file listing permissions that must be
                           granted to the driver to use Kerberos authentication
                           with a Security Manager

JDBCDriverLogin.conf       Configuration file that instructs the driver to use
                           the Kerberos login module for authentication

krb5.conf                  Kerberos configuration file


INSTALL_DIR/lib/JCA/META-INF:
-----------------------------
db2.xml                    DB2 resource adapter deployment descriptor

informix.xml               Informix resource adapter deployment descriptor

mysql.xml                  MySQL resource adapter deployment descriptor

oracle.xml                 Oracle resource adapter deployment descriptor

sqlserver.xml              SQL Server resource adapter deployment descriptor

sybase.xml                 Sybase resource adapter deployment descriptor

MANIFEST.MF                Manifest file


INSTALL_DIR/pool manager:
-------------------------
pool.jar                   All DataDirect Connection Pool Manager classes


INSTALL_DIR/SQLServer JTA/32-bit:
---------------------------------
instjdbc.sql               File for installing JTA stored procedures

sqljdbc.dll                File for use with JTA stored procedures
                           (32-bit version)


INSTALL_DIR/SQLServer JTA/64-bit:
---------------------------------
instjdbc.sql               File for installing JTA stored procedures

sqljdbc.dll                File for use with JTA stored procedures 
                           (Itanium 64-bit version)


INSTALL_DIR/SQLServer JTA/x64-bit:
----------------------------------
instjdbc.sql               File for installing JTA stored procedures

sqljdbc.dll                File for use with JTA stored Procedures 
                           (AMD64 and Intel EM64T 64-bit version)


INSTALL_DIR/testforjdbc:
------------------------
Config.txt                 Configuration file for DataDirect Test 

ddlogging.properties       Logging properties file

testforjdbc.bat            Batch file to start DataDirect Test

testforjdbc.sh             Shell script to start DataDirect Test

lib/testforjdbc.jar        DataDirect Test classes


INSTALL_DIR/UninstallerData:
----------------------------
resource/*.*               Resource files for the Windows  
                           uninstaller

.com.zerog.registry.xml    Support file for the uninstaller

InstallScript.iap_xml      Support file for the uninstaller

installvariables.properties
                           Support file for the Windows uninstaller

Uninstall Progress DataDirect Connect (R) and Connect XE for JDBC 5.1 SP4.exe
                           Windows uninstaller

Uninstall Progress DataDirect Connect (R) and Connect XE for JDBC 5.1 SP4.lax
                           Support file for the Windows uninstaller

uninstaller.jar            Java uninstaller


INSTALL_DIR/UninstallerData/Logs:
---------------------------------
Progress_DataDirect_Connect_(R)_for_JDBC_5.1_SP4_InstallLog.log
                           Log file created by the Windows installer


INSTALL_DIR/wizards:
--------------------
index.html                 HTML file to launch the Performance Tuning Wizard
                           applet

JDBCPerf.jar               Jar file containing the classes for the Performance
                           Tuning Wizard applet

images/*.*                 Graphic files used by the Performance Tuning 
                           Wizard applet



14 February 2019
===============
End of README