4 Replies Latest reply on Oct 27, 2016 10:38 AM by Steven Hawkins

    need help on bulk loading queries to slide underneath daily CPU limit

    Omar Khan Novice

      Hi,

       

      This is going to sound weird, but today I ran into a hard quota on our Teradata server that caps how much CPU can be used daily. The limit is global and set for everyone.

       

      What was happening was that the queries Teiid sent to Teradata were cumulatively adding up. This was occurring while running a large federated query between PostgreSQL and Teradata; even though each individual query coming out of Teiid's planner was not at all complex, Teradata counted them all against the daily CPU usage total.

       

      The DBA recommended doing a bulk insert with a custom fastload utility: export a CSV from our local materialized view in PostgreSQL and load it into a temporary table in Teradata. This would move the various dependent joins to the server instead of having Teiid pass the values over in IN clauses. I overrode the translator defaults down to 5 dependent join predicates and 100 values per IN clause; the server would hang with the defaults of 50 and 1000.
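      For reference, this is roughly the shape of that override in our vdb.xml (the override name is made up here; MaxInCriteriaSize and MaxDependentInPredicates are, as far as I can tell, the translator properties behind the 1000 and 50 defaults):

```xml
<!-- Sketch only; "teradata-low-limits" is a made-up name. Reference it
     from the model's <source translator-name="teradata-low-limits" ...>. -->
<translator name="teradata-low-limits" type="teradata">
    <property name="MaxInCriteriaSize" value="100"/>
    <property name="MaxDependentInPredicates" value="5"/>
</translator>
```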

       

      Does Teiid have a bulk load facility that we could leverage to insert into a temp table?  I saw a max insert statements setting in the translator properties. Would we need to write a Java program to do this (I read something about max prepared statements), or is this something we could accomplish in SQL?

       

      If we don't use the fastload utility, we will probably still have to come in underneath the daily CPU usage, so I think we would also need to load only a certain number of rows at a time.
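      In other words, something like loading the keys in fixed-size slices (all schema, table, and column names below are placeholders):

```sql
-- Hypothetical chunked load; tera.tmp_keys, pg.mat_view, and key_col
-- are made-up names. Each statement moves one slice of the
-- materialized view into the Teradata temp table, keeping any single
-- request small.
INSERT INTO tera.tmp_keys (key_col)
    SELECT key_col FROM pg.mat_view ORDER BY key_col LIMIT 0, 10000;
INSERT INTO tera.tmp_keys (key_col)
    SELECT key_col FROM pg.mat_view ORDER BY key_col LIMIT 10000, 10000;
```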

        • 1. Re: need help on bulk loading queries to slide underneath daily CPU limit
          Steven Hawkins Master

          > the server would hang with the default 50 and 1000

           

           Which server would hang?  Do you have a thread dump or other information to confirm what you were seeing?

           

          > Does teiid have a bulkload facility that we could leverage to insert to a temp table?  I saw max insert statements on the translator properties. Would we need to write a Java program to do this (read something around max prepared statements), or is this something we could accomplish in SQL?

           

          Teiid does handle insert into with query expressions - insert into tbl select ...

           

          This will load the target table using prepared insert statement batches and may not have the same performance as a specific bulk load facility.
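           For example, assuming hypothetical source and target names, you could stage the values once and then join against them server-side:

```sql
-- All schema/table/column names here are placeholders. Teiid fetches
-- the rows from the source model and pushes them to Teradata as
-- batched prepared inserts.
INSERT INTO tera.tmp_keys (key_col)
    SELECT DISTINCT key_col FROM pg.mat_view;

-- Subsequent queries can then join against tera.tmp_keys on the
-- Teradata side instead of shipping the values in large IN clauses.
```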

           

           

          • 2. Re: need help on bulk loading queries to slide underneath daily CPU limit
            Omar Khan Novice

            Hi Steven,

             

             The server that failed was Teradata; it failed in two ways. Unfortunately, I don't have a thread dump.

             

             (Again, I was able to get around this by limiting the IN clause to 100 values instead of the default 1000, and dependent joins to 5 instead of the default 50. I'm not sure whether anything besides the IN change helped, but I set the join limit too and it seemed to work.)

             

             1. When I copied and pasted the query Teiid generated into SQL Assistant to test it, it simply hung on execution. The query was extremely long; in fact, just trying to paste it into the editor here crashes Chrome.

             2. The second way produced a different error from Teradata (I had to redact this):

             

            2016-10-23 18:19:15,555 WARN  [org.teiid.CONNECTOR] (Worker19_QueryProcessorQueue144) LqnGMFsLQlNX Connector worker process failed for atomic-request=LqnGMFsLQlNX.0.78.39: org.teiid.translator.jdbc.JDBCExecutionException: 3919 TEIID11008:TEIID11004 Error executing statement(s): [Prepared Values: [] SQL: SELECT g_0."A" AS c_0, TRIM(TRAILING FROM g_0."B") AS c_1, g_0."C" AS c_2, TRIM(TRAILING FROM g_1."D") AS c_3, g_1."E" AS c_4, TRIM(TRAILING FROM g_1."F") AS c_5, TRIM(TRAILING FROM g_2."G") AS c_6, TRIM(TRAILING FROM g_2."H") AS c_7, g_1."I" AS c_8, TRIM(TRAILING FROM g_1."J") AS c_9, g_1."K" AS c_10, g_1."K" AS c_11, TRIM(TRAILING FROM g_0."L") AS c_12, g_0."M" AS c_13, g_0."N" AS c_14, g_0."O" AS c_15, g_0."P" AS c_16 FROM "Q"."R" AS g_0, "Q"."S" AS g_1, "Q"."T" AS g_2 WHERE g_1."E" = g_2."E" AND TRIM(TRAILING FROM g_1."F") = TRIM(TRAILING FROM g_2."F") AND TRIM(TRAILING FROM g_1."D") = TRIM(TRAILING FROM g_2."D") AND g_1."E" = g_0."E" AND TRIM(TRAILING FROM g_1."F") = TRIM(TRAILING FROM g_0."F") AND TRIM(TRAILING FROM g_1."D") = TRIM(TRAILING FROM g_0."D") AND g_0."A" IN  <clause continues with various ANDs for 600000 characters>

            Caused by: java.sql.SQLException: [Teradata Database] [TeraJDBC 15.10.00.22] [Error 3919] [SQLState HY000] Table has too many columns.

              at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDatabaseSQLException(ErrorFactory.java:308)

              at com.teradata.jdbc.jdbc_4.statemachine.ReceiveInitSubState.action(ReceiveInitSubState.java:103)

              at com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.subStateMachine(StatementReceiveState.java:311)

              at com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.action(StatementReceiveState.java:200)

              at com.teradata.jdbc.jdbc_4.statemachine.StatementController.runBody(StatementController.java:137)

              at com.teradata.jdbc.jdbc_4.statemachine.StatementController.run(StatementController.java:128)

              at com.teradata.jdbc.jdbc_4.TDStatement.executeStatement(TDStatement.java:389)

              at com.teradata.jdbc.jdbc_4.TDStatement.prepareRequest(TDStatement.java:576)

              at com.teradata.jdbc.jdbc_4.TDPreparedStatement.<init>(TDPreparedStatement.java:130)

              at com.teradata.jdbc.jdk6.JDK6_SQL_PreparedStatement.<init>(JDK6_SQL_PreparedStatement.java:30)

              at com.teradata.jdbc.jdk6.JDK6_SQL_Connection.constructPreparedStatement(JDK6_SQL_Connection.java:81)

              at com.teradata.jdbc.jdbc_4.TDSession.prepareStatement(TDSession.java:1325)

              at com.teradata.jdbc.jdbc_4.TDSession.prepareStatement(TDSession.java:1369)

              at org.jboss.jca.adapters.jdbc.BaseWrapperManagedConnection.doPrepareStatement(BaseWrapperManagedConnection.java:782)

              at org.jboss.jca.adapters.jdbc.BaseWrapperManagedConnection.prepareStatement(BaseWrapperManagedConnection.java:768)

              at org.jboss.jca.adapters.jdbc.WrappedConnection.prepareStatement(WrappedConnection.java:454)

              at org.teiid.translator.jdbc.JDBCBaseExecution.getPreparedStatement(JDBCBaseExecution.java:196)

              at org.teiid.translator.jdbc.JDBCQueryExecution.execute(JDBCQueryExecution.java:121)

              ... 64 more

             

             

            Thanks for the tip regarding the insert!!!

             I'm less concerned about it not being as fast; I'm more concerned with having a solid data virtualization approach that doesn't depend on all of the native bells and whistles of the RDBMS.

            • 3. Re: need help on bulk loading queries to slide underneath daily CPU limit
              Omar Khan Novice

               Steven - bulk loading with an insert from Teiid into a temp table in Teradata helped considerably. It took only a couple of minutes to get everything that had been in the IN clause over to the server, and the server could then work on it much more productively and quickly.
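               Concretely, the shape of what worked (names are placeholders for the redacted ones above): stage the keys once, then join instead of using a giant IN list:

```sql
-- Placeholder names throughout. One INSERT ... SELECT stages the keys,
-- then the big query joins against the staged table entirely on the
-- Teradata side.
INSERT INTO tera.tmp_keys (key_col)
    SELECT DISTINCT "A" FROM pg.mat_view;

SELECT g_0.*
    FROM "Q"."R" AS g_0
    JOIN tera.tmp_keys AS k ON g_0."A" = k.key_col;
```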

               

              Thanks for your help!

              • 4. Re: need help on bulk loading queries to slide underneath daily CPU limit
                Steven Hawkins Master

                 Glad to hear it's working for you.  I'll look to see whether the default settings for the Teradata translator need to be changed to prevent the exception you showed above.