Sunday, July 31, 2016

Redshift <- Denodo <- Tableau: TDE - Large Data Volume Extracts:



1: Creating large extracts in Tableau from data sources with a high volume of data at low grain poses challenges.

2: Analyze the table size and the time taken to create the extract, along with the data model design and the SQL sent to the different layers in the process.

3: The Tableau configuration for creating large data sets in the TDE and TDC (timeout, DSN parameters, etc.) is critical to overcoming the errors.

4: Tableau Server errors appear while trying to create the extract (communication protocol, resource limit reached, etc.) (7200).

5: Creating the extract via the virtual layer takes more time than a direct connection using native drivers (a 1:2 ratio).

6: Estimate the size of the table, the number of columns, and how much time the TDE takes to create (25M rows with 350 columns consumes 90 GB of space and takes 2 hrs).

7: Overcome the Tableau Server errors by changing and configuring the query limit and timeout settings as required by the extract.

8: Tweak the ODBC DSN settings to find the right combination for extract creation after the analysis (Timeout, UseDeclareFetch & Cache Size) (4 hrs : 6 : 10000).

9: "Out of memory while reading tuples" error - enable support for cursors with Amazon Redshift, and configure and create a TDC file as required (see the cursor sketch after this list).

10: Using the native driver and a direct connection to Redshift, extract creation drops the connection at 907 seconds with a protocol error - an unknown/open issue.

11: Key aspects to consider: optimal compression encoding, distribution key, sort key, aggregation, and filters that make use of the keys (see the DDL sketch after this list).
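
For items 8-9, UseDeclareFetch and the cache size make a PostgreSQL-style ODBC driver page through the result set with a server-side cursor instead of buffering all tuples at once, which is what avoids the out-of-memory error. A minimal sketch of the equivalent SQL against Redshift (the table name is illustrative, not from the actual model):

    BEGIN;
    -- Declare a server-side cursor so rows stream in batches instead of one buffer
    DECLARE extract_cur CURSOR FOR
        SELECT * FROM sales_fact;
    -- Fetch batches sized like the DSN cache size (10000 in item 8);
    -- repeat until no rows are returned
    FETCH FORWARD 10000 FROM extract_cur;
    CLOSE extract_cur;
    COMMIT;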
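
For item 11, a minimal Redshift DDL sketch of those key aspects; the table, columns, keys, and encodings are illustrative assumptions, not the actual extract model:

    CREATE TABLE sales_fact (
        sale_date    DATE          ENCODE delta,    -- compression encoding per column
        customer_id  INTEGER       ENCODE lzo,
        region       VARCHAR(32)   ENCODE bytedict,
        amount       DECIMAL(18,2) ENCODE lzo
    )
    DISTKEY (customer_id)   -- distribute rows on the common join column
    SORTKEY (sale_date);    -- filters on the sort key let Redshift skip disk blocks

    -- Aggregate and filter on the keys so the extract query actually uses them:
    SELECT region, SUM(amount) AS total_amount
    FROM sales_fact
    WHERE sale_date >= '2016-01-01'
    GROUP BY region;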

Saturday, March 5, 2016

Denodo Server - Setup & configure Cache Database to MySQL:



Denodo Server - Setup & Configure Cache Database - Bottlenecks

Step 1:
Install the Denodo server by running the executable and following the steps, providing the inputs and files required to complete the server install.
There are 8 steps in the process, and the install takes 10-15 minutes to complete.

Step 2: Bottlenecks faced
The Admin Tool gave a connection error when connecting to the VDP server, for the reasons below:
                      Connection error
                      The Denodo service starts and then stops immediately
                      The log file shows a license error message
The solution that worked was to close all the Denodo Admin Tools, start the Denodo VDP server and scheduler, and then open the Admin Tool to connect to the server.

Step 3: Denodo setup activities to import the VQL or create a new database and data source connection as required (normal admin & developer work).

Step 4: Validate the new database created on Denodo and query the underlying objects to make sure everything works as expected. Also check the respective Denodo server and scheduler logs to verify any errors.

Step 5: By default, embedded Derby is configured as the cache database for the objects; now we will replace it with a MySQL database for cache loading & performance.

Step 6: MySQL - install and configure the database:
Bottlenecks faced:
                      Prerequisites were not met, so this ended up requiring an install of the Microsoft .NET Framework (Visual C++ redistributable kit) for the install to continue successfully.
                      MySQL commands were executed from the command prompt to connect to, create & configure the cache database with the required options (see the sketch below).
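
A minimal sketch of the command-prompt setup in Step 6; the database name, user, and password are illustrative, not the actual values:

    -- Run from the mysql command-line client as an administrative user
    CREATE DATABASE denodo_cache CHARACTER SET utf8;              -- hypothetical cache DB name
    CREATE USER 'denodo'@'localhost' IDENTIFIED BY 'denodo_pwd';  -- hypothetical credentials
    GRANT ALL PRIVILEGES ON denodo_cache.* TO 'denodo'@'localhost';
    FLUSH PRIVILEGES;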

Step 7: Configure the MySQL cache database in the Denodo database cache options.
Bottlenecks faced:
                com.mysql.jdbc driver error when the new database is configured in Denodo
                Could not change the cache parameters - unable to find the jar/driver

Step 8: Place the MySQL Connector/J JAR file in the lib folder of both Denodo and MySQL; after that, the new database could be configured successfully.

Step 9: Configure the cache for the object, load the cache, invalidate the cache, and repeat for a few iterations to make sure the functionality works and is consistent (see the VQL sketch below).
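
A minimal VQL sketch of the Step 9 load/invalidate cycle, assuming a view named customer_v that already has full cache mode enabled (the view name is illustrative):

    -- Preload the cache, invalidating the previously cached rows first
    SELECT * FROM customer_v
    CONTEXT('cache_preload' = 'true',
            'cache_invalidate' = 'all_rows',
            'cache_wait_for_load' = 'true',
            'cache_return_query_results' = 'false');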

Saturday, December 5, 2015

Data Virtualization Challenges in Reports Using Denodo & Tableau



Tableau 9.1.2 -> Denodo 5.5 -> Amazon Redshift - Issues & Errors:

1: Using ODBC as the Data Source Name (DSN) in Tableau Server - connectivity issue.

           Data sources from Denodo using a DSN were published to Tableau Server.

           Tried to create a worksheet using the Tableau Server data source.

           Able to query dimension columns but not measures, because aggregation is not supported when using ODBC data sources.

           Currently, the design of published data sources using ODBC connections doesn't allow aggregations.


2: Using PostgreSQL to connect directly to Denodo (data virtualization).

           Data sources for Denodo were published to Tableau Server using PostgreSQL connection info.

           Tried to create a worksheet using the Tableau Server data source over PostgreSQL.

           All dimension and measure columns work fine as expected.

           But a string function error occurs when using this driver (PostgreSQL -> Denodo -> Redshift).


3: Using the Tableau Amazon Redshift connector to connect directly to the Redshift data source.

           No issues when connecting directly to Redshift with Tableau; any column can be queried and filters applied in the worksheet.

OBI 12c - A Quick Snap of the Installation


Installing OBI 12c - Challenges / Issues Encountered on Windows:

1: Download all the required software for the install (JDK 8, WebLogic 12c, Oracle 12c, OBIEE 12c) & extract all zips to one folder.

2: Install JDK 8.0 and validate the version.

3: Install WebLogic Server using the jar file; it will open the installer (OUI).

4: Install Oracle Database 12c and make sure the components of the disk2 folder are copied into the main folder's components ("File not found" errors occurred twice because components were not available).

5: All the zip folders of Disk2 of OBI have to be inside the main folder for the OBI 12c install; disk space will also be used for extraction.

6: Kick off the Installation of the OBI 12c software.

7: Once the installation is over, run config.cmd to go through the configuration of the BI platform and its components.

Errors encountered in completing the 12c installation (WebLogic - Java version issue; Oracle DB - "file not found" due to missing components; the OBI 12c installation doesn't proceed because the files are not in the same root folder).

8: Please check the prerequisites before kicking off the OBI 12c installation, as it will not work even after the installation and configuration report success.

9: Tried a couple of times to access the analytics URL after bouncing all the services and rebooting the server machine; still got the same Error 500 / connection refused.

Then finally figured out how to access analytics using the admin URL, and was able to get in and explore the 12c new features and OBI components.

Tuesday, November 3, 2015

Quick Snap of Workbook Operation


User Request -> Gateway -> VizPortal -> VizQL -> Data Engine -> File Store -> VizQL -> Response -> User

(Data / Extract Refresh) -> Backgrounder process -> each core -> 8 GB RAM -> up to 4 processes

(Analytics Access / Use) -> VizQL process -> each core -> 8 GB RAM -> up to 4 processes

Saturday, March 21, 2015

BI - Reporting Tools - Explore:
 
 
ODBC Errors - Application & System Issues (Analytics & Noetix): Technical Solutions, Workarounds & Guidance for the OBI & Noetix Teams

Noetix: Insufficient privileges - any user - first-time access to shared folders
Fix: Check the AD group association, clear the browser cache, and try again.

Analytics: [nQSError: 96002] Essbase Error: Unknown Member (for the report/page/dashboard)
Fix: Validate the correct member and change the filter in the report as required.

Analytics: [nQSError: 46118] Out of disk space.
Fix: The database is down, or the OBI server tmp file size has reached its limit.

Analytics: [nQSError: 96002] Essbase: Query is allocating too large memory (> 4 GB)
Fix: Put default values on the prompt to restrict the record set for the MDX sent by OBI.

Analytics: [nQSError: 17012] Bulk fetch failed. Oracle error code: 1013
Fix: Oracle error - get the physical SQL, execute it in Toad to reproduce the error, and fix it.

Analytics: [nQSError: 14050] Internal error: could not find level detailed enough for GROUP BY clause
Fix: An OBI presentation error - tweak the aggregation rule and the column formula.

Analytics: [nQSError: 14025] No fact table exists at the requested level of detail.
Fix: Fix the consistency warning in the RPD and deploy.

Analytics: [nQSError: 14046] No physical join relationships exist between any of the following tables.
Fix: Fix the consistency warning in the RPD and deploy.

Analytics: [nQSError: 14070] Cannot find logical table source coverage for logical columns: [Amount].
Fix: The logical column mapping for the column is missing in the business layer.

Analytics: [nQSError: 15018] Incorrectly defined logical table source.
Fix: Check the business model and apply the fix for the warning as required (data model).

Analytics: [nQSError: 16001] ODBC error [DataDirect][ODBC PostgreSQL Wire Protocol driver] Socket closed.
Fix: The destination server was down - check the server and port of the PostgreSQL instance.

Analytics: [nQSError: 17010] ORA-01031: insufficient privileges / invalid identifier / bulk fetch failed.
Fix: Check that the column is present in the underlying DB object (table).

Analytics: [nQSError: 17014] Could not connect to Oracle database.
Fix: Database down, TNS error, or a basic connectivity issue.

Analytics: [nQSError: 23006] The session variable, NQ_SESSION., has no value definition.
Fix: Check the default value and also the underlying SQL in the initialization block (IB).

Analytics: [nQSError: 96002] Essbase Error: Request [MdxReport] terminated since it exceeded the time limit
Fix: A configuration set by the Essbase admin to kill long-running sessions.

Analytics: [nQSError: 46118] Out of disk space.
Fix: Check the memory configuration at the Essbase & OBI server level to overcome this.

Analytics: [nQSError: 46017] Sort has no work space in the work directory.
Fix: The tmp directory set in the OBI configuration has reached its limits - check privileges.

Analytics: [nQSError: 46066] Operation cancelled.
Fix: The cursor was closed or the running request was cancelled by an administrator.

Analytics: [nQSError: 60009] The user request exceeded the maximum query governing execution time.
Fix: Long-running MDX/SQL sent to the data source is being killed by the server.

Analytics: [nQSError: 46073] Operation 'write() tmp dir': No such file or directory.
Fix: Check the write privilege on the tmp dir setting in the OBI NQSConfig.

Analytics: [nQSError: 17014] ORA-12154: TNS: could not resolve the connect identifier
Fix: Check the basic connection settings to the Oracle DB.

Analytics: [nQSError: 17010] ORA-00910: specified length too long for its datatype at OCI call
Fix: Check the data type & size of the column in the physical layer of the RPD and change it.

Analytics: [nQSError: 59107] Set operation failed due to incompatible sort order.
Fix: Validate the union reports and apply the correct sort order for the column.

Analytics: [nQSError: 42015] Cannot function ship the following expression:
Fix: Tweak the column formula, apply the aggregation rule, and inspect the MDX / physical SQL.

Analytics: [nQSError: 96002] Essbase Error: Invalid login id - please login again
Fix: Check the Essbase server connectivity details.

Analytics: [nQSError: 96002] Essbase Error: Network error: A descendant member of the expected level is returned.
Fix: Check the Essbase hierarchy for the members being displayed in the report.

Analytics: [nQSError: 96002] Essbase Error: Network error: Timed out before receiving all data
Fix: Essbase connectivity issue - check the server, port, application, and cube.

Analytics: [nQSError: 96002] Essbase Error: Application currently not accepting connections
Fix: Either a cube refresh/switch is in progress, or the application is down.

Analytics: [nQSError: 96002] Please check if you tried to connect to a secure port
Fix: Server config - validate the setting based on the data source connection.

Analytics: [nQSError: 59011] Overflow occurred during the execution of AddFloat.
Fix: Tweak the column formula, try a conversion, and check the SQL in Oracle.

Analytics: [nQSError: 96002] Essbase Error: Not enough memory for formula. Set MAXFORMULACACHESIZE
Fix: Tweak the MAXFORMULACACHESIZE setting in Essbase.cfg on the Essbase server.

Analytics: [nQSError: 17010] ORA-00980: synonym translation is no longer valid
Fix: Validate or recreate the synonym as required for the object used in the report.

Analytics: [nQSError: 46104] The current operation was about to overflow the stack.
Fix: Get the physical SQL, make it work in Toad, and translate the fix back to OBI.

Analytics: [nQSError: 17010] ORA-00932: inconsistent datatypes: expected - got CLOB
Fix: Convert the data type in the column formula, apply the size, and retry (see the sketch after this table).

Analytics: [nQSError: 22023] An arithmetic operation is being carried out on a non-numeric type.
Fix: Validate the column formula and apply data type conversion operations.

Analytics: [nQSError: 22025] Function TimestampAdd is called with an incompatible type.
Fix: Validate the column formula and apply data type conversion operations (see the sketch after this table).

Analytics: [nQSError: 23006] The session variable, NQ_SESSION.SRR_QTR, has no value definition.
Fix: Check the SQL and the default value in the IB in the OBIEE RPD.

Analytics: [nQSError: 96002] Essbase Error: Administrator Has Temporarily Disabled User Commands
Fix: The Essbase server, application, or cube is down or under maintenance.
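
Several of the data-type entries above (ORA-00932, 22023, 22025) come down to adding an explicit conversion; a minimal sketch, with illustrative table and column names:

    -- ORA-00932 (got CLOB): convert the CLOB to VARCHAR2 before comparing or grouping
    SELECT DBMS_LOB.SUBSTR(notes_clob, 4000, 1) AS notes_txt
    FROM orders_f;

    -- [nQSError: 22025] TimestampAdd on an incompatible type: cast to TIMESTAMP first
    -- (OBIEE logical SQL column formula):
    -- TIMESTAMPADD(SQL_TSI_DAY, 7, CAST("Order Date" AS TIMESTAMP))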