Sunday, July 31, 2016

Redshift <- Denodo <- Tableau <- TDE: Large Data Volume Extracts



1: Creating a large extract in Tableau from data sources that hold a high volume of data at a low grain poses real challenges in getting the extract to complete.

2: Analyze the table size and the time taken to create the extract, along with the data model design and the SQL sent to each layer in the process.

3: Tableau configuration for creating large data sets via TDE and TDC files (timeouts, DSN parameters, etc.) is critical to overcoming the errors.

4: Tableau Server errors occur while trying to create the extract (communication protocol errors, "resource limit reached", etc.), likely tied to the 7200-second query limit.

5: Creating the extract through the virtual layer takes about twice as long as a direct connection using native drivers (roughly a 1:2 ratio).

6: Estimate the table size and column count, and from that how long the TDE will take to create (25M rows with 350 columns consumes ~90 GB of space and takes ~2 hours); see the back-of-envelope check below.
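
A quick read on those numbers, derived only from the figures above, gives a rule of thumb for sizing similar tables:

    90 GB / 25M rows ≈ 3.6 KB per row (≈ 10 bytes per column at 350 columns)
    25M rows / 2 hrs ≈ 3,500 rows per second end to end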

7: Overcome the Tableau Server errors by changing and configuring the query limit and timeout settings to what the extract requires; a sketch of the commands follows.
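
On a 2016-era Tableau Server this is done with tabadmin; the 14400-second value below is purely illustrative, set it to whatever your extract actually needs:

    tabadmin stop
    tabadmin set backgrounder.querylimit 14400
    tabadmin configure
    tabadmin start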

8: After the analysis, tweak the ODBC DSN settings to find the right combination for extract creation (Timeout, UseDeclareFetch & Cache Size; the working combination here was 4 hrs : 6 : 10000). A sample DSN is sketched below.
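
A minimal odbc.ini sketch of the kind of DSN being tuned. The DSN name, host, and database are placeholders, and mapping "Cache Size" to the driver's Fetch parameter is an assumption based on the psqlODBC-derived Redshift driver:

    [Redshift-TDE]
    # hypothetical DSN for illustration only
    Driver=/opt/amazon/redshiftodbc/lib/64/libamazonredshiftodbc64.so
    Host=mycluster.xxxxxxxx.us-east-1.redshift.amazonaws.com
    Port=5439
    Database=analytics
    # stream rows through a server-side cursor instead of buffering the full result set
    UseDeclareFetch=1
    # rows per fetch ("Cache Size" in some driver UIs)
    Fetch=10000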

9: "Out of memory while reading tuples" error: this comes down to cursor support for Amazon Redshift; configure it and create a TDC file as required (sketch below).
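
A sketch of the kind of TDC customization involved, using Tableau's documented odbc-connect-string-extras capability; the version number and fetch values are illustrative:

    <connection-customization class='redshift' enabled='true' version='9.3'>
      <vendor name='redshift'/>
      <driver name='redshift'/>
      <customizations>
        <!-- pass cursor settings through to the ODBC driver -->
        <customization name='odbc-connect-string-extras' value='UseDeclareFetch=1;Fetch=10000'/>
      </customizations>
    </connection-customization>

Tableau Desktop picks the file up from the My Tableau Repository\Datasources folder; on Tableau Server it goes in the corresponding server datasources directory.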

10: Using the native driver with a direct connection to Redshift, extract creation drops the connection at ~907 seconds with a protocol error; the cause is unknown and remains an open issue.

11: Key Redshift aspects to consider: optimal compression encoding, distribution key, sort key, aggregation, and applying filters that make use of the keys (example below).
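
A hypothetical Redshift sketch tying those aspects together; the table, columns, and encodings are illustrative, not from the original workload:

    -- distribute on the join key, sort on the common filter column
    CREATE TABLE sales (
        -- sort key column left uncompressed, a common Redshift recommendation
        sale_date    DATE          ENCODE raw,
        customer_id  INT           ENCODE lzo,
        region       VARCHAR(16)   ENCODE bytedict,
        amount       DECIMAL(12,2) ENCODE delta32k
    )
    DISTKEY (customer_id)
    SORTKEY (sale_date);

    -- aggregate before extracting, and filter on the sort key so Redshift can skip blocks
    SELECT region, DATE_TRUNC('month', sale_date) AS sale_month, SUM(amount) AS total
    FROM sales
    WHERE sale_date >= '2016-01-01'
    GROUP BY region, DATE_TRUNC('month', sale_date);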
