Could you please suggest an expression to convert a SQL Server DATE to the Salesforce date format?
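A minimal sketch of such a conversion, assuming the SQL Server value arrives in an Expression transformation as a string port named src_date in 'YYYY-MM-DD HH24:MI:SS' form (both the port name and the format mask are illustrative and must be matched to the actual source field):

    -- illustrative Expression-transformation logic; Salesforce date fields
    -- accept a date/time value, so converting the incoming string is usually enough
    TO_DATE(src_date, 'YYYY-MM-DD HH24:MI:SS')

    -- if the target field instead expects text in Salesforce's YYYY-MM-DD form
    TO_CHAR(TO_DATE(src_date, 'YYYY-MM-DD HH24:MI:SS'), 'YYYY-MM-DD')

If the source column already reaches the mapping as a date/time data type, the inner TO_DATE is not needed.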
@aabyaz5124 · 1 month ago
Hi sir, by mistake I created two trial accounts with two email ids using the same password, and installed the 64-bit Secure Agent under one email id. But when I try to open Informatica Cloud it does not open; it says the user name or password is not valid, and when I try to reset the password it asks me to register an account. Please guide me.
@itsranjan2003 · 1 month ago
@@aabyaz5124 You can create another account using one email id with a different user name, then install the Secure Agent again and try.
@aravindkumar9476 · 1 month ago
Hi, how can tomorrow's file be triggered automatically? For example, I have a file called 07-11-2024_orders.csv which I run today, and the job is scheduled for tomorrow; tomorrow it should automatically pick up tomorrow's file, i.e. 08-11-2024_orders.csv. How do I solve this? This was asked in an interview.
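One common approach (a sketch, assuming the file names always follow the DD-MM-YYYY_orders.csv pattern; the expression below is illustrative) is to parameterize the source file name and derive it from the system date, so a scheduled run resolves to that day's file:

    -- illustrative expression that builds the current day's file name at run time
    TO_CHAR(SYSDATE, 'DD-MM-YYYY') || '_orders.csv'

The resulting value can feed a file-name parameter (for example via a small pre-job script that rewrites the parameter file, or an in-out parameter), so tomorrow's scheduled run picks up 08-11-2024_orders.csv without any manual change.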
@RiyaDhar-v3u · 2 months ago
If for the same record one source has status_id coming as 'Y' and the other source has status_id coming as 'N', what will be generated in the target after the Union?
@itsranjan2003 · 2 months ago
@@RiyaDhar-v3u In that case you will have the value Y for one record and the value N for the other record; the Union transformation does not merge or deduplicate the two rows.
@kirangoud6917 · 2 months ago
Hi sir, please make videos on real-time projects and explain how the work is done in them.
@anilanche5753 · 2 months ago
Hi Ranjan sir, please let me know how to migrate jobs from DataStage to IICS.
@itsranjan2003 · 2 months ago
@@anilanche5753 Hi, you need to redevelop the DataStage jobs in IICS based on the logic implemented in DataStage.
@anilanche5753 · 2 months ago
@@itsranjan2003 Don't we have any migration tool? And can we use the export and import options?
@itsranjan2003 · 2 months ago
@@anilanche5753 I have not heard of any migration tool. Please raise an Informatica global support ticket; they may suggest one if such a tool is available.
@itsranjan2003 · 2 months ago
Export and import will work for PowerCenter and IICS code only.
@anilanche5753 · 2 months ago
@@itsranjan2003 Okay, I will do that, sir. In the meanwhile, if you come across anything, please let me know, sir.
@ForUBorn · 2 months ago
The voice is very low... I had to turn the volume up to full.
@haneef3017 · 2 months ago
Please explain the IICS admin activities.
@crazycom9807 · 2 months ago
Thank you sir
@jayanthikasa244 · 3 months ago
Very informative. Can you please continue with more videos on this topic?
@PradeepKanaparthy · 3 months ago
Sir, can you help me with the Sequence Generator? I have a mapping with three targets and I want to generate the same sequence numbers for all three targets. How do I achieve it? I have tried but couldn't get it to work.
@itsranjan2003 · 3 months ago
@@PradeepKanaparthy Use an Expression transformation after the Sequence Generator and map NEXTVAL through it to the targets.
@PradeepKanaparthy · 3 months ago
@@itsranjan2003 I connected:
seq_gen -> expression -> tgt1
seq_gen -> expression -> tgt2
src -> expr -> router -> tgt1
and tgt1 is connected. Please suggest how to achieve it.
@itsranjan2003 · 3 months ago
@@PradeepKanaparthy Add one Expression transformation after the Sequence Generator, map NEXTVAL into the expression, and from the expression map that field to the three targets.
@muhammadsakr2027 · 3 months ago
When I choose the target, the Salesforce connection doesn't appear in the target's connection options; it only appears for the source. [I can only choose a flat-file target] Can you help?
@manikantamani7849 · 4 months ago
What's the difference between a variable field and an output field?
@mauriciomartinez9245 · 4 months ago
Hi, I'm trying to run a mapping that updates three different target tables. The issue is that the three have different columns, and I only need to update one column in each of the three, so it seems I can't use dynamic mapping tasks. What do you suggest?
@mdsaad9190-l2e · 4 months ago
Hello sir, I would like to migrate multiple tables from Teradata into S3, but when selecting the source as multiple objects I am not able to select all the tables; only one table gets selected. I have a requirement to migrate multiple tables through only one mapping task. Can you please help me, sir?
Hi Ranjan, I am trying to create a similar process, but my requirement is that I pass the parameter values while executing a taskflow. I am seeing that the job always takes the values assigned in the task (I have checked the run-time override option).
@itsranjan2003 · 4 months ago
Hi, have you provided the parameter file path and parameter file name in the taskflow --> Data Task --> Input Fields? Please check; it should work.
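For reference, a minimal parameter file of the kind being discussed could look like the lines below; the parameter names and values are placeholders and must match the parameters defined in the mapping:

    [Global]
    $$SRC_FILE_NAME=08-11-2024_orders.csv
    $$LOAD_DATE=2024-11-08

Pointing the Data Task's parameter file path and parameter file name input fields at this file is what lets the taskflow-supplied values override whatever was saved in the task.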
@rsmchittithalli7334 · 4 months ago
All your videos are very simple and very informative.
@itsranjan2003 · 4 months ago
@@rsmchittithalli7334 Thanks
@atlanticoceanvoyagebird2630 · 4 months ago
Dear Sir: my connection is set up the same way as shown, but it does not work. Please help. Thanks.
@itsranjan2003 · 4 months ago
If you have created the service account correctly, downloaded the JSON file, and provided the required details correctly while creating the connection, it should work fine. It seems you might not have provided the details correctly while creating the connection.
@rsmchittithalli7334 · 4 months ago
Very informative
@rsmchittithalli7334 · 4 months ago
Very nice and informative. Please take care of the background noise
@rsmchittithalli7334 · 4 months ago
Very helpful
@itsranjan2003 · 4 months ago
@@rsmchittithalli7334 thanks
@rsmchittithalli7334 · 4 months ago
Very informative
@rsmchittithalli7334 · 4 months ago
Very helpful sir
@herry2288 · 4 months ago
Can you please share this as one PDF or Word document?
@kirangoud6917 · 5 months ago
Hi sir, please make more videos on IICS Informatica.
@DEVNARAYANPANDIT-v8n · 5 months ago
Error: The DTM process terminated unexpectedly. Contact Informatica Global Customer Support. Can you please help me with this?
@ShaileshNavghare-n6x · 5 months ago
What if I have a source query of the type "select * from schema.table" and I want this schema to be parameterized? I have schema A for the dev environment and schema B for QAT. I am trying to define an input parameter with type string, but it's not working.
@itsranjan2003 · 5 months ago
For the schema, define the parameter type as "connection", and for the tables define the parameter type as "object", while creating the input parameters.
@piyushmohanty6909 · 5 months ago
Thank you for sharing. What benefits does it provide compared with Data Loader? How does it behave with a bulk number of records, say around 500k? Does it take a batch count, i.e. can only x records be inserted in one go? Will all the automations like triggers or validation rules still work if I follow this process to update some records? Suppose Data Loader takes 5 hours to load 500k records with a batch count of 50 (to avoid a too-many-SOQL exception while keeping the respective trigger on); do you think this tool will work better in that scenario?
@itsranjan2003 · 5 months ago
Yes, you need to define the batch size in the target properties, and that determines how many records are inserted in one go. Regarding triggers and validation rules, Informatica will behave like Data Loader, as these rules take effect only after the records are inserted into the target SFDC objects. If you don't want all fields to be updated as part of the upsert functionality, you have the option to insert records in one flow and update the object by mapping only the required fields in another flow. Using Informatica, you also have the option to transform the data (join multiple objects, filter records, aggregate data, etc.) in a better way than with Data Loader. Regarding the...
@guddu11000 · 5 months ago
Sir, how can I compare the columns and data types of the source before executing the mapping, and act based on the comparison results?
@itsranjan2003 · 5 months ago
To verify the columns and data types you need to use a hard-coded DB/file connection and tables; only then will you be able to see the table columns and data types. Please refer to my video 6.7.
@guddu11000 · 5 months ago
I saw that; it is related to parameterization. What I want is to extract all the columns and their data types from the source and compare them with the previous day's structure of the same source. Do you know of any function that can extract the column names and data types as a comma-separated list?
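One way to get such a list, sketched here for a source database that exposes INFORMATION_SCHEMA; the schema and table names are placeholders, and the aggregate function differs by database (LISTAGG in Snowflake/Oracle, STRING_AGG in SQL Server/PostgreSQL, GROUP_CONCAT in MySQL):

    -- one row per table, with "column_name data_type" pairs as a comma-separated list
    SELECT table_name,
           LISTAGG(column_name || ' ' || data_type, ',')
               WITHIN GROUP (ORDER BY ordinal_position) AS column_list
    FROM   information_schema.columns
    WHERE  table_schema = 'MY_SCHEMA'          -- placeholder schema
      AND  table_name   = 'MY_SOURCE_TABLE'    -- placeholder table
    GROUP BY table_name;

Saving yesterday's output and comparing it with today's (for example in a pre-load step) gives the structure comparison before the mapping is executed.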
@kieees_7601 · 6 months ago
Hi @itsranjan2003, while parameterizing the source and target table names, can we keep multiple source and target table names in the parameter file?
@itsranjan2003 · 6 months ago
@@kieees_7601 You can keep the first parameter with the first parameter value, the second parameter with the second parameter value, and so on.
@kieees_7601 · 6 months ago
@@itsranjan2003 For example, I am running two mappings, each with different source tables and different target tables. Can I use one single parameter file, i.e. mention the two different source tables and two different target tables from the different mappings in the same single parameter file? Like:
first_src_table=name
first_tgt_table=name
(the above for the first mapping)
second_src_table=name
second_tgt_table=name
(the above for the second mapping)
@itsranjan2003 · 6 months ago
@@kieees_7601 Yes, you can use this.
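A sketch of what that shared parameter file could look like, assuming each mapping runs through its own mapping task; all project, folder, task and parameter names below are placeholders, and the exact section-header syntax should be confirmed against the parameter file documentation for your version:

    #USE_SECTIONS
    [MyProject].[MyFolder].[mt_first_mapping]
    $$first_src_table=SRC_TABLE_1
    $$first_tgt_table=TGT_TABLE_1
    [MyProject].[MyFolder].[mt_second_mapping]
    $$second_src_table=SRC_TABLE_2
    $$second_tgt_table=TGT_TABLE_2

Each task only reads the parameters in its own section, so both mappings can point at the same file.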
@atlanticoceanvoyagebird2630 · 6 months ago
Sir: I appreciate your teaching method, which is easy to understand and implement, especially for beginners. Muhammad Musa Khan, North America.
@itsranjan2003 · 6 months ago
Thank you.
@sohombosuchoudhury · 6 months ago
The sound is very low in this one.
@ashokkumar-fv6go · 6 months ago
Hi, when I'm uploading a flat file the fields are showing up horizontally. How can I convert them to a vertical layout? Please help me with this.
@itsranjan2003 · 6 months ago
Please check whether you have selected the field delimiter properly.
@kieees_7601 · 6 months ago
Thanks @itsranjan2003, but I want to check one thing. Scenario: I want to load 200 source tables from a MySQL DB to 200 target tables in a Snowflake DB using just one single mapping, with no transformations, and run that mapping every morning, applying CDC from the source data to the target data. How is that possible, any idea?
@itsranjan2003 · 6 months ago
Please try a replication task; multiple source and target tables can be used in that task (my video, section 4.3).
@kieees_7601 · 6 months ago
@@itsranjan2003 Thank you.
@kieees_7601 · 6 months ago
@@itsranjan2003 Is it also possible to have the source tables come from different databases and different schemas within those databases, and achieve that with a single replication task? Or should we create a separate replication task per database and schema? Note: the target database and schema are constant for all the different source databases.
@itsranjan2003 · 6 months ago
@@kieees_7601 I think it will allow tables from only one database in one replication task. For multiple databases and schemas, you may need to use multiple replication tasks.
@Rk_itz_me · 1 month ago
@@kieees_7601 I think the mass ingestion technique can be applied from IICS; for example, from S3 to Snowflake staging we run a taskflow with the help of mass ingestion, so the Snowflake schema gets loaded on a daily basis.
@himanshu597 · 6 months ago
There is no voice in this video.
@itsranjan2003 · 6 months ago
Yes, please refer to the steps.
@himanshu597 · 6 months ago
@@itsranjan2003 Thank you for your efforts, Ranjan!!
@ShashankGupta-eo3tk · 7 months ago
Brother, there is no audio in the video.
@jeffbauman7567 · 7 months ago
From where can we get the access key and secret key? And what's inside the service account JSON file?
@itsranjan2003 · 7 months ago
It will have project_id, private_key, client_email, etc. Please refer to this URL for generating a service account: developers.google.com/workspace/guides/create-credentials#service-account
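For reference, a Google service account key file generally has the shape shown below; every value here is a dummy, and the real file is simply downloaded from the Google Cloud console when the key is created:

    {
      "type": "service_account",
      "project_id": "my-gcp-project",
      "private_key_id": "0123456789abcdef",
      "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
      "client_email": "my-agent@my-gcp-project.iam.gserviceaccount.com",
      "client_id": "123456789012345678901",
      "auth_uri": "https://accounts.google.com/o/oauth2/auth",
      "token_uri": "https://oauth2.googleapis.com/token"
    }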
@RiteshGatpalli · 7 months ago
How do I get the account name?
@itsranjan2003 · 7 months ago
After you log in to Snowflake, on the left side click Admin --> Accounts. Then hover the cursor over the account name and you will find a notification symbol; click on that and you will see a URL. From the URL, copy the part that comes before ".snowflakecomputing.com". For example, if vabfghj-oiujk876767.snowflakecomputing.com is the URL, the account will be vabfghj-oiujk876767.
@RiteshGatpalli · 7 months ago
@@itsranjan2003 Thanks!
@seeanj1 · 7 months ago
Hi Ranjan, I need your help; can I connect with you? It looks like you have good experience with parameters and I need your expertise. I can pay for your service. I really appreciate it.
@TangM-p7p · 7 months ago
How do I use the IN operator in a Filter transformation?
@itsranjan2003 · 7 months ago
Use the OR operator.
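A small sketch of what that looks like as a filter condition; the field name and values are made up for illustration:

    -- equivalent of "status_id IN ('A', 'B', 'C')" expressed with OR
    status_id = 'A' OR status_id = 'B' OR status_id = 'C'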
@roshanmahapatra6765 · 8 months ago
Nice explanation
@nishantchavan2980 · 8 months ago
Nice video
@AbAnkra · 8 months ago
What is your email, so I can contact you?
@PradeepKanaparthy · 8 months ago
Hi sir, if we parameterize the Source transformation with a connection parameter and an object parameter, the incoming fields won't be there. In that case how can we transform the data if we want to change a date format? Is there any way we can both parameterize and transform the data?
@itsranjan2003 · 8 months ago
Use a hard-coded connection/object, then update the field, then parameterize the connection/object.
@ShaileshNavghare-n6x · 7 months ago
@@itsranjan2003 I still don't understand how this will work. When I hard-code and update the fields, everything seems fine in the target field mapping, but when I parameterize the source, the target shows some fields as unavailable for field mapping.
@itsranjan2003 · 7 months ago
@@ShaileshNavghare-n6x Please watch my video in section 6.7.
@itsranjan2003 · 8 months ago
@@PradeepKanaparthy Use the hard-coded connection value, update the date format, validate it, then parameterize the connection/object again.
@PradeepKanaparthy · 8 months ago
@@itsranjan2003 Super, sir. It worked! Thank you.
@PradeepKanaparthy · 8 months ago
Hi, if I want to add transformation logic in a parameterized mapping, how do I add it?
@itsranjan2003 · 8 months ago
Use a hard-coded connection/object, then update the field, then parameterize the connection/object.
@PaulaCarvajal7 · 9 months ago
Why did you make two connections, src and tgt? What do they mean? What will this do?
@itsranjan2003 · 8 months ago
Best practice is to create separate connections for the source and target databases or file paths.