
Informatica PowerCenter 8.1 Notes

2012-03-28 16:33
1. Write SQL in the mapping only, then refresh the mapping in the workflow (session).

2. Be careful with data types and lengths.

3. Follow the naming convention and description standard.

4. Check the result option when using the Lookup transformation.

5. Each port in the Source Qualifier should be bound to the source.

6. Check whether there are unused targets or sources in the mapping. They can cause the error:

ERROR: Failed to prepare the task.

In that case no session log is produced, even though the mapping and workflow are valid.

7. In the Lookup transformation, pay attention to the column data types in the database, e.g. CHAR vs. VARCHAR.

8. The Lookup transformation will compare NULL values and return the matched record.

9. Configure the $Source/$Target connection in the Mapping Designer for the Lookup transformation, so that it is consistent with the session property.

Tips

1. Lookup SQL override -- use column aliases (AS); the number and order of the lookup ports must match the columns in the SQL statement (see the sketch after this list).

2. Target load plan -- controls the order in which the targets in the mapping are loaded.
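A minimal sketch of a lookup SQL override with aliases, referenced from tip 1 (table and column names are made up for illustration); the SELECT list must line up with the lookup ports in number and order:

SELECT CUST.CUSTOMER_ID AS CUSTOMER_ID,
       CUST.CUSTOMER_NAME AS CUSTOMER_NAME,
       CUST.STATUS_CD AS STATUS_CD
FROM CUSTOMER_DIM CUST
WHERE CUST.ACTIVE_FLAG = 'Y'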

Performance

1. joiner -- make the smaller file the master.

2. In general unconnected lookups should be used if possible since less data is cached.

3. Use Sorted Input in Aggregator and Joiner transformations when the input exceeds roughly 250,000 rows.

dateadd()

convert(datetime, '$$SRC_FLT_DT')

CREATE_DATE > to_date('$$mv_INCREMENT_DATE', 'MM-DD-YYYY HH24:MI:SS')

SETMAXVARIABLE($$mv_INCREMENT_DATE,CREATE_DATE)
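Taken together, the two lines above form the usual incremental-load pattern (a sketch, assuming $$mv_INCREMENT_DATE is declared as a mapping variable with Max aggregation): the Source Qualifier filter reads only rows newer than the last run, and an Expression port advances the variable so the next run picks up where this one ended.

-- Source Qualifier filter: read only rows created after the stored watermark
CREATE_DATE > to_date('$$mv_INCREMENT_DATE', 'MM-DD-YYYY HH24:MI:SS')

-- Expression transformation port: keep the highest CREATE_DATE seen this run
SETMAXVARIABLE($$mv_INCREMENT_DATE, CREATE_DATE)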

FIRST( ITEM_NAME, ITEM_PRICE > 40 )

============================================================================================================

pmcmd abortworkflow -uv USERNAME -pv PASSWORD -sv $INTEGRATION_SERVICE -d $DOMAIN -f US_FL_TMAP_IA_LOAD -nowait wf_TMAP_PARM_FILE_LD

pmcmd startworkflow -uv INFA_USERNAME -pv INFA_PASSWORD -sv ${INFA_INTEGRATION_SERVICE} -d ${INFA_DOMAIN} -f ${FOLDER} -paramfile ${PARAMFILE} ${WAIT_FLAG} ${WORKFLOW_NAME} >> ${log_file}

pmcmd startworkflow -uv USERNAME -pv PASSWORD -sv $INTEGRATION_SERVICE -d $DOMAIN -f $FOLDER $WAIT_FLAG $WORKFLOW_NAME

WAIT_FLAG=-wait

WAIT_FLAG=-nowait

============================================================================================================

<INFA_HOME>/server/bin> sh infacmd.sh ConvertLogFile -in <input_filename> -lo <output_filename>

sh infacmd.sh ConvertLogFile -in /schedapps/tmap/log/s_m_TMAP_PURGE_LOCATION_LOAD_LOG_v1.log.bin -lo /schedapps/tmap/log/s_m_TMAP_PURGE_LOCATION_LOAD_LOG_v1.txt

============================================================================================================

Insert

Update as Update

Update as Insert

Update else Insert

Delete
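These are the row-treatment options for a target in the session; which rows arrive flagged as inserts or updates is usually decided upstream by an Update Strategy transformation. A sketch of such an expression (lkp_EXISTING_KEY is a hypothetical lookup return port, NULL when the row is not yet in the target):

IIF(ISNULL(lkp_EXISTING_KEY), DD_INSERT, DD_UPDATE)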

=====================================================

0 - false

1 - true

=====================================================

INSTR(COMPANY, 'A') -- Returns the position of the first occurrence of 'A', searching from the beginning

INSTR(COMPANY, 'A', 1, 2) -- Returns the position of the second occurrence of 'A', searching from the beginning

INSTR(COMPANY, 'A', -1, 2) -- Returns the position of the second occurrence of 'A', searching from the end

Returns 0 if not found.

=====================================================

COMPANY='abcd'

SUBSTR(COMPANY, 0, 3) = SUBSTR(COMPANY, 1, 3) = 'abc'

SUBSTR(COMPANY, -2, 1) = 'c'

SUBSTR(COMPANY, -2, 8) = 'cd'

SUBSTR(COMPANY, 5, 3) = ''

=====================================================

The datetime format stored in the database should match the datetime format used in the transformation.

It is also better to check whether the string length is zero before converting (see the sketch after the examples below).

3/28/2011 3:04:12 PM

TO_DATE(LTRIM(RTRIM(inspection_start)), 'MM/DD/YYYY HH12:MI:SS AM')

2011-03-01 06:30:00

to_date(substr(v_message, 0, instr(v_message, '^')-1), 'YYYY-MM-DD HH24:MI:SS')
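Combining the two notes above, a guarded conversion might look like this (a sketch reusing the inspection_start example):

IIF(LENGTH(LTRIM(RTRIM(inspection_start))) = 0, NULL,
    TO_DATE(LTRIM(RTRIM(inspection_start)), 'MM/DD/YYYY HH12:MI:SS AM'))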

=====================================================

Use an Aggregator transformation to collapse multiple records into one.

FIRST(message, substr(ltrim(rtrim(uniqkey)), 11, 2)='86')

LAST(message)

Use a Union transformation to turn one record into multiple records.

Use a variable port to track the value of the previous record (see the port sketch below).
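A sketch of the previous-record pattern in an Expression transformation (port names are illustrative). Ports are evaluated top to bottom, so v_PREV_VALUE reads v_CURR_VALUE before it is overwritten with the current row's value:

v_PREV_VALUE (variable) = v_CURR_VALUE
v_CURR_VALUE (variable) = in_VALUE
o_PREV_VALUE (output)   = v_PREV_VALUE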

=====================================================

DECODE(value,
    first_search, first_result,
    ...,
    [default])

v_Pattern =
decode(v_Occurrence + 1,
    1, '(.*)',
    2, '(.*)\^(.*)',
    3, '(.*)\^(.*)\^(.*)',
    4, '(.*)\^(.*)\^(.*)\^(.*)',
    5, '(.*)\^(.*)\^(.*)\^(.*)\^(.*)')

DECODE(o_Occurrence5 + 1,
    1, '(.*)',
    2, '(.*)[\'||$$mp_DELIMITER||'](.*)',
    3, '(.*)[\'||$$mp_DELIMITER||'](.*)[\'||$$mp_DELIMITER||'](.*)',
    4, '(.*)[\'||$$mp_DELIMITER||'](.*)[\'||$$mp_DELIMITER||'](.*)[\'||$$mp_DELIMITER||'](.*)'
)

value2 =
reg_extract(Switchdatime, v_Pattern, 2)

=====================================================

Lookup transformation

-- Check the data types of the lookup table columns, e.g. CHAR vs. VARCHAR.

-- For the override SQL, the Generate SQL function can be used as a starting point.

=====================================================

IIF(CufOffDate = TRUNC(sysdate, 'D'), ,)

=====================================================

How to invoke a Stored Procedure transformation:

:sp.sp_Get_Trip_Num(PROC_RESULT)

Define two ports in the Stored Procedure transformation.

=====================================================

Define the mapping parameter: $$mp_parm

In the Source Qualifier SQL, reference it as '$$mp_parm' (quoted) if it is a string.

In expressions, '$$mp_parm' cannot be used; reference it directly as $$mp_parm (see the sketch below).
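A sketch of both usages, assuming $$mp_parm is a string mapping parameter (ORDERS and REGION_CD are made-up names):

-- Source Qualifier SQL override: quote the parameter
SELECT * FROM ORDERS WHERE REGION_CD = '$$mp_parm'

-- Expression transformation: no quotes
IIF(REGION_CD = $$mp_parm, 'IN_SCOPE', 'OUT_OF_SCOPE')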

=====================================================

$s_m_PLAN_VS_ACTUAL_LEG_LVL_FACT_DPS_ONLY.Status = Succeeded

=====================================================

[GLOBAL]

mplt_Convert_GMT_to_Local.$$mp_INVALID_DT=0001-01-01 00:00:00

[s_m_TMAPDPM_Archive_Old_DataFromWageHistory]

$PMBadFileDir=$PMRootDir/tmapdpm/BadFiles

$PMSessionLogDir=$PMRootDir/tmapdpm/SessLogs

$PMSourceFileDir=$PMRootDir/tmapdpm/SrcFiles

$PMTargetFileDir=$PMRootDir/tmapdpm/TgtFiles

$DBConnectionDRIVERPAYROLLTMAP=TMAP_DPM_PRD

The FIRST() function in the Aggregator transformation ignores NULL values, unless all input values are NULL.

When a function adds a NULL value to another value, it returns NULL.

When a value is compared with NULL, the comparison returns FALSE.
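Because of this NULL behavior, it is usually worth guarding with ISNULL() before arithmetic or comparisons; a small sketch (port names made up):

IIF(ISNULL(ITEM_PRICE), 0, ITEM_PRICE) + IIF(ISNULL(TAX_AMT), 0, TAX_AMT)

IIF(ISNULL(STATUS_CD), 'UNKNOWN', STATUS_CD)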