
Phoenix Data Sources

The following is an example of setting up a Phoenix data source, which is a precondition for using the Apache HBase Translator. In addition to data source setup, this article also covers mapping a Phoenix table to an existing HBase table and creating a new Phoenix table.

There are configuration templates for Phoenix data sources in the "<jboss-install>/docs/teiid/datasources" directory. A complete description of how a data source can be added to JBoss AS 7.x is also described here.

Configuring a Phoenix data source in JBoss/WildFly (JBoss AS 7.x or later)

Configuring a Phoenix data source is nearly identical to configuring a JDBC data source. The first step is deploying the Phoenix driver jar. Use the CLI command below to deploy the Phoenix driver:
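A deployment command along these lines can be used (the jar file name and path are examples; substitute the client jar version you downloaded):

```shell
# Connect to the running server with the JBoss CLI, then deploy the
# Phoenix client jar (jar name below is an example)
./bin/jboss-cli.sh --connect
deploy /path/to/phoenix-4.x-client.jar
```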

The driver jar can be downloaded from the Phoenix documentation.

The second step is creating the data source based on the driver deployed above, which is also like creating a JDBC data source. Use the CLI command below to create the data source:
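A data-source creation command might look like the following (the data-source name `phoenixDS`, the ZooKeeper address, and the credentials are placeholders; the driver-name must match the jar deployed above):

```shell
# Create and enable the data source via the JBoss CLI
# (names, URL, and credentials below are example values)
/subsystem=datasources/data-source=phoenixDS:add(jndi-name=java:/phoenixDS, driver-name=phoenix-4.x-client.jar, connection-url=jdbc:phoenix:127.0.0.1:2181, user-name=sa, password=sa)
/subsystem=datasources/data-source=phoenixDS:enable
```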

Please make sure the URL, Driver, and other properties are configured correctly:

  • jndi-name - The JNDI name needs to match the JNDI name used in the VDB
  • driver-name - The driver name needs to match the driver deployed in the steps above
  • connection-url - The URL needs to point to the HBase ZooKeeper quorum server; the format is jdbc:phoenix [ :<zookeeper quorum> [ :<port number> ] [ :<root node> ] ], e.g. 'jdbc:phoenix:127.0.0.1:2181'
  • user-name/password - The user credentials for the Phoenix connection
The Phoenix connection's autoCommit default is false. Set phoenix.connection.autoCommit to true if you will be executing INSERT/UPDATE/DELETE statements against Phoenix.
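One way to set this is as a connection property on the data source via the CLI (the data-source name `phoenixDS` is an example; the property name is the one given above):

```shell
# Add phoenix.connection.autoCommit=true as a connection property
# on an existing data source (example data-source name)
/subsystem=datasources/data-source=phoenixDS/connection-properties=phoenix.connection.autoCommit:add(value=true)
```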

Mapping Phoenix table to an existing HBase table

Mapping a Phoenix table to an existing HBase table has two steps. The first step is installing phoenix-[version]-server.jar into the classpath of every HBase region server. An easy way to do this is to copy it into the HBase lib directory; for more details please refer to the Phoenix documentation.

The second step is executing the DDL to map a Phoenix table to the existing HBase table. The DDL can be executed either via the Phoenix command line or through JDBC.
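The Phoenix command line is the sqlline.py script shipped with Phoenix; a typical invocation looks like this (the quorum address and script path are examples):

```shell
# Launch the Phoenix SQL command line against the ZooKeeper quorum,
# optionally executing a DDL script file
./bin/sqlline.py 127.0.0.1:2181 /path/to/mapping.sql
```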

The following is an example of mapping an existing HBase table named Customer with the following structure:


As depicted above, the HBase Customer table has two column families, customer and sales, each with two column qualifiers: name and city, and product and amount, respectively. We can map this table to Phoenix via DDL:
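A DDL statement along these lines performs the mapping (the row key column name ROW_ID is an assumption; the double quotes preserve the case of the HBase names):

```sql
-- Map the existing HBase "Customer" table into Phoenix.
-- "ROW_ID" (an assumed name) maps to the HBase row key;
-- the remaining columns reference family."qualifier" pairs.
CREATE TABLE IF NOT EXISTS "Customer" (
    "ROW_ID" VARCHAR PRIMARY KEY,
    "customer"."name"  VARCHAR,
    "customer"."city"  VARCHAR,
    "sales"."product"  VARCHAR,
    "sales"."amount"   VARCHAR
);
```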

For more about mapping a Phoenix table to an existing HBase table, please refer to the Phoenix documentation.

Creating a new Phoenix table

Creating a new Phoenix table is just like mapping to an existing HBase table. Phoenix will create any metadata (table, column families) that does not exist. Similar to the above example, the DDL to create the Phoenix/HBase Customer table would be:
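Under the same assumptions as the mapping example (hypothetical ROW_ID key column, VARCHAR types), the statement is the same; Phoenix creates the underlying HBase table and column families if they are missing:

```sql
-- Create the Customer table; Phoenix will create the HBase table
-- and the "customer"/"sales" column families if they do not exist
CREATE TABLE IF NOT EXISTS "Customer" (
    "ROW_ID" VARCHAR PRIMARY KEY,
    "customer"."name"  VARCHAR,
    "customer"."city"  VARCHAR,
    "sales"."product"  VARCHAR,
    "sales"."amount"   VARCHAR
);
```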

Defining Foreign Table in VDB

Finally, we need to define a foreign table in the VDB that maps to the Phoenix table. The following principles should be considered when defining the foreign table:

  • the nameinsource option on the Table is used to match the Phoenix table name
  • the nameinsource option on each Column is used to match the HBase table's columns
  • creating a primary key is recommended; the primary key column should match the Phoenix table's primary key/HBase row id.

Using the 'Customer' table from the mapping section above, below is an example:
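A foreign-table definition following the principles above might look like this (Teiid DDL; the column names on the Teiid side and the ROW_ID key name are assumptions carried over from the earlier sketch):

```sql
-- Teiid foreign table mapped to the Phoenix "Customer" table.
-- nameinsource on the table matches the Phoenix table name;
-- nameinsource on each column matches the HBase family."qualifier".
CREATE FOREIGN TABLE Customer (
    PK string OPTIONS (nameinsource '"ROW_ID"'),
    name string OPTIONS (nameinsource '"customer"."name"'),
    city string OPTIONS (nameinsource '"customer"."city"'),
    product string OPTIONS (nameinsource '"sales"."product"'),
    amount string OPTIONS (nameinsource '"sales"."amount"'),
    CONSTRAINT PK0 PRIMARY KEY (PK)
) OPTIONS (nameinsource '"Customer"', UPDATABLE 'true');
```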

A "Constraint violation. X may not be null" exception may be thrown when updating a table without a defined primary key.
  1. Jan 18, 2015

    I didn't find how to attach a picture, so I added the picture via URL. How do I attach an image to a page?

    1. Jan 20, 2015

      On the top of the editor there is a button to insert an image.

  2. Jan 29, 2015

    I have tested these steps. The Phoenix driver depends on security login classes (com.sun.security.auth.module.UnixLoginModule), so simply deploying the driver doesn't work; the following module dependencies should be added:

    • javax.api
    • sun.jdk
    • org.apache.log4j (if this is not added, an error is logged)

    I have changed the doc correspondingly.

  3. Feb 19, 2015

    Hi.

    Thank you for detailed explanation on Phoenix driver deployment! I could add some notes on that though.

    I've tried to install Phoenix driver as module according to this documentation. Phoenix version is 4.3 and WildFly version is 8.2.

    I kept getting an exception on datasource connection retrieval:

    I've added a dependency on the system module "org.apache.commons.logging" and finally the datasource worked, so I think you may want to add that to your article.

    Final version of "module.xml":
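    Based on the dependencies collected in this thread (javax.api, sun.jdk, org.apache.log4j, org.apache.commons.logging), the module.xml presumably looked something like this sketch (the resource jar name is an example):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<module xmlns="urn:jboss:module:1.1" name="org.apache.phoenix">
    <resources>
        <!-- jar name is an example; use the client jar you downloaded -->
        <resource-root path="phoenix-4.3.0-client.jar"/>
    </resources>
    <dependencies>
        <module name="javax.api"/>
        <module name="sun.jdk"/>
        <module name="org.apache.log4j"/>
        <module name="org.apache.commons.logging"/>
    </dependencies>
</module>
```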

  4. Feb 19, 2015

    Teiid does not have support for any version WildFly yet.

  5. Apr 14, 2016

    I used the above information to configure a Phoenix DataSource for a Wildfly 10 instance and I thought I would share my results.

    wildfly-10.0.0.Final/modules/org/apache/phoenix/main/module.xml
    Wildfly Datasource Snippet
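    The snippet itself did not survive; it presumably resembled a standard WildFly datasources-subsystem entry in standalone.xml along these lines (JNDI name, pool name, quorum address, and credentials are placeholders):

```xml
<!-- Fragment of the datasources subsystem in standalone.xml;
     names, URL, and credentials below are example values -->
<datasource jndi-name="java:/phoenixDS" pool-name="phoenixDS" enabled="true">
    <connection-url>jdbc:phoenix:127.0.0.1:2181</connection-url>
    <driver>phoenix</driver>
    <security>
        <user-name>sa</user-name>
        <password>sa</password>
    </security>
</datasource>
<drivers>
    <driver name="phoenix" module="org.apache.phoenix">
        <driver-class>org.apache.phoenix.jdbc.PhoenixDriver</driver-class>
    </driver>
</drivers>
```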
    Environment

    Name        Version
    Phoenix     4.4
    Wildfly     10.0.0.Final
    Ambari      2.2.1.0
    HDFS        2.7.1.2.4
    MapReduce2  2.7.1.2.4
    YARN        2.7.1.2.4
    HBase       1.1.2.2.4

    See Also:  HDP 2.4.0 Release Notes