Hello, I am a fresher to the Pega ecosystem, but I have learned a lot about Pega administration from a relative who works as a Pega admin. Do I have any opportunities as a fresher?
@HarshaTrainingsacademy 2 years ago
Hi Hemanth, there will be fewer opportunities in the market for admins; the market is always good for Pega developers. Harsha Trainings is one of the leading placement consultants, providing training with qualified trainers on Pega, DevOps, Power BI, Java, Data Science, MuleSoft, Spring & Microservices, Oracle Finance, Appian, Core & Advanced Java, Django, Communication & Verbal skills, and various other leading software technologies. For those who have a gap after education, we have job placement assurance courses. Subscribe to our channel to learn the best in technology. For training, call us on 9652532753, 9885312299.
@mdilipkumar5164 2 years ago
I can proudly say that I am a product of you, Harsha... you are a god of Pega, not only for your students but also for your subscribers.
@HarshaTrainingsacademy 2 years ago
Hi Dilip, thank you for your compliment... Please refer anyone looking for Pega training.
@kirangem 1 year ago
Thanks a lot for creating this video and for the coverage of the scenarios. It is very helpful.
@HarshaTrainingsacademy 1 year ago
Thank you! Keep supporting our channel.
@yesheshveupadhyay9250 2 years ago
Nice content, well explained.
@battulasatish2589 6 months ago
Very useful 👌
@sushmatatikonda6031 2 years ago
Awesome 👍 video, very useful, with a perfect explanation and good initiative... thank you so much.
@HarshaTrainingsacademy 2 years ago
Hi Sushma, you are welcome... Please do subscribe to the channel.
@ram_c33 1 year ago
Sir, for processing 10,000 records you said to use a job scheduler to do the queuing, but without a case, how can we queue the items to the queue processor?
@sushants149 2 years ago
Hi Harsha, can you please tell me whether it is possible to fetch data stored in the BLOB via Obj-Browse, and what is the difference between Obj-Browse and a report definition when fetching data from the BLOB?
@HarshaTrainingsacademy 2 years ago
Hi, yes, we can fetch it by using Obj-Browse... Obj-Browse cannot use BLOB properties in its where conditions, but a report definition can.
@sushants149 2 years ago
@HarshaTrainingsacademy Thanks, Harsha.
@vineelagv7094 2 years ago
THANK YOU VERY MUCH SIR ..😀 GREAT JOB
@HarshaTrainingsacademy 2 years ago
Most welcome
@sugurupavani8837 2 years ago
Could you please upload videos on Data flows and data sets?
@HarshaTrainingsacademy 2 years ago
Sure, we will.
@tanyasingh2294 1 year ago
Hi @harshatrainings, after fetching the records in the job scheduler, can you please explain how the queue processor will pick up those cases? I mean, what will be the logic in the activity that we need to use?
@HarshaTrainingsacademy 1 year ago
Certainly. When working with a job scheduler and a queue processor to process records, the logic in the activity that picks up and processes these cases will depend on your specific use case and requirements. However, here is a general outline of the logic and steps involved in designing such a system:
1. Retrieve records from the job scheduler:
   - The job scheduler is responsible for periodically fetching records or tasks that need to be processed.
   - These records can represent various jobs, tasks, or cases that require some form of automated processing.
2. Queue processor logic:
   - The queue processor is responsible for picking up these records from the job scheduler and processing them.
3. Processing logic. The processing logic within the activity depends on the nature of the records and the tasks they represent. Common steps and considerations:
   3.1. Dequeue records:
      - Retrieve records from the queue or job scheduler.
      - Depending on the queue system you are using (e.g., message queue, database table), dequeue the next available record.
   3.2. Process records:
      - Implement the specific processing logic required for each record or task.
      - This can involve data transformation, calculations, database operations, sending notifications, or any other required actions.
   3.3. Error handling:
      - Implement error handling mechanisms to deal with exceptions or failures during processing.
      - Decide on strategies for retrying failed tasks, logging errors, and reporting issues.
   3.4. Completion and cleanup:
      - After successful processing, mark the record as completed or remove it from the queue, depending on your system's requirements.
      - Perform any necessary cleanup tasks.
   3.5. Logging and monitoring:
      - Implement logging mechanisms to track the progress and status of each record.
      - Set up monitoring to ensure the queue processor is running as expected and to detect any issues promptly.
   3.6. Concurrency and scaling:
      - Depending on the volume of records and the processing time for each record, consider concurrency and load balancing to process records efficiently.
      - Ensure your queue processor can scale horizontally to handle increased loads.
4. Repeat the process: the queue processor typically runs continuously or at predefined intervals to pick up new records from the job scheduler and process them.
5. Exception handling: implement a strategy for exceptional situations, such as when there are no records to process or when the queue processor encounters errors.
6. Monitoring and alerts: set up monitoring and alerting mechanisms so you are notified of any issues or delays in processing.
7. Testing and validation: thoroughly test the queue processor and processing logic with various types of records to ensure it behaves as expected.
8. Documentation: document the entire process, including the design, configuration, and operation of the queue processor and activity.
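To make the dequeue/process/retry pattern above concrete, here is a minimal, platform-agnostic sketch in Java. This is not Pega's internal implementation; in Pega you would only write the equivalent of handleRecord() in the queue processor's activity, while dequeuing, retries, and commits are handled by the Queue Processor rule's own configuration. The Record type, MAX_RETRIES, and handleRecord() below are hypothetical names used purely for illustration.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

/** Generic worker-loop sketch: dequeue, process, retry on failure, give up after a limit. */
public class QueueWorker {
    record Record(String id, String payload, int attempts) {}   // hypothetical queued item

    private static final int MAX_RETRIES = 3;
    private final BlockingQueue<Record> queue = new LinkedBlockingQueue<>();

    /** Producers (e.g. a scheduled job) enqueue items here. */
    public void submit(Record item) {
        queue.offer(item);
    }

    /** Consumer loop: mirrors steps 3.1 to 3.4 above. */
    public void run() throws InterruptedException {
        while (!Thread.currentThread().isInterrupted()) {
            Record item = queue.take();                 // 3.1 dequeue the next available record
            try {
                handleRecord(item);                     // 3.2 business-specific processing
                // 3.4 success: take() already removed the item from the queue
            } catch (Exception e) {
                // 3.3 error handling: retry up to a limit, then log and give up
                if (item.attempts() < MAX_RETRIES) {
                    queue.offer(new Record(item.id(), item.payload(), item.attempts() + 1));
                } else {
                    System.err.println("Giving up on " + item.id() + ": " + e.getMessage());
                }
            }
        }
    }

    private void handleRecord(Record item) {
        // hypothetical processing logic (data transformation, DB update, notification, ...)
        System.out.println("Processing " + item.id());
    }
}
```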
@tanyasingh2294 1 year ago
Thanks @HarshaTrainingsacademy
@rajeshbatchu3397 1 year ago
When you say servers, are these the nodes configured for the job scheduler/queue processor?
@mavurisarika5953 1 year ago
Hi Harsha, thanks for making this interview question series, which is helping us a lot. Quick question: when an item queued by a user runs in the context of the user's access group (AG) and an item queued by a manager runs in the context of the manager's AG, how will the processing differ for the user and the manager if both belong to the same application? How does the rule resolution of the processing activity differ, since both AGs point to the same application?
@HarshaTrainingsacademy 1 year ago
Sorry for the late reply!
In Pega, when items are queued for processing, the behavior can differ based on the user's access group (AG). Access groups determine a user's access rights and privileges within the application, and different access groups can have different configurations, security settings, and rule resolution settings, which can affect the processing behavior for different users. Consider the scenario where both the user and the manager belong to the same application but have different access groups. Here is how the processing can differ:
1. Rule resolution: when a user submits an item for processing, the processing activity is resolved based on the user's access group; the system looks for the activity rule in the rulesets available to that access group. Similarly, when a manager submits an item, the activity is resolved based on the manager's access group. If the same activity rule exists in different rulesets accessible to the two access groups, the system executes the version that corresponds to the respective access group.
2. Security settings: access groups can have different security settings, including data access permissions and privileges, so the same processing activity executed by the user and the manager may interact with different data sources or have different levels of access to data. For example, if the activity performs a database operation, the user's access group may have read-only access to certain tables while the manager's access group has read-write access to them, which can produce different outcomes when processing the same item.
3. Configuration differences: access groups can have different configurations for components such as user interfaces, flows, and assignments, which affects how the processing activity interacts with the user and the manager during execution. For instance, if the activity displays or interacts with a user interface, the user and the manager may see different UI configurations based on their access groups' settings.
Overall, the key difference in processing behavior comes from rule resolution and the distinct configurations, permissions, and security settings associated with each user's access group. Even though both users belong to the same application, the processing behavior can vary based on their individual access group contexts. Thank you!
@SanthoshKumar-456 2 years ago
Thanks, Harsha, for uploading the interview questions and answers session. Please make a video on integration.
@HarshaTrainingsacademy 2 years ago
Sure, we will.
@anilkumarreddy4053 2 years ago
Yes, highly needed.
@deepgaurav3648 2 years ago
Please upload scenario-based questions on case management.
@HarshaTrainingsacademy 2 years ago
Sure... the next video is on case management.
@sushants149 2 years ago
Hi Harsha, in which version were savable data pages introduced? Was it version 7 or 8?
@HarshaTrainingsacademy 2 years ago
Pega 7.2 onwards...
@sushants149 2 years ago
Thanks Harsha
@sameermohammad7620 2 years ago
How is job scheduler processing more efficient than an advanced agent? For the queue processor we have Kafka, but what about the job scheduler?
@MrPraveensiva 2 years ago
One of the major advantages of using the job scheduler is that we can see the execution data, i.e. success and failure rates. With advanced agents it is hard to find out how many instances failed or succeeded; all of that data is stored in the pr_perf_stats table. So rather than raw performance, the benefit is that monitoring of the execution data is readily available for job schedulers.
@HarshaTrainingsacademy 2 years ago
Thank you, Praveen, for taking the initiative to answer the question... appreciate your help.
@sameermohammad7620 2 years ago
@MrPraveensiva That's okay, but how is performance improved here? The monitoring part I understand.
@RSMScelebtalks 2 years ago
Hi Harsha sir, I have an interview question that I faced: can we perform Obj operations in a data transform?
@HarshaTrainingsacademy 2 years ago
No, we cannot do that.
@maridiraju2139 2 years ago
For a job scheduler, agent-schedule data instances are not created, so how can we disable it on specific nodes if the job scheduler runs on multiple nodes?
@HarshaTrainingsacademy 2 years ago
Hi Raju, this can be done using Admin Studio, where we can stop the job scheduler or queue processor.
@srinivasnaru3460 2 years ago
Please conduct an interview session on integration.
@HarshaTrainingsacademy 2 years ago
Sure, we will.
@anilkumarreddy4053 2 years ago
Yes
@ramaambarukhana6463 2 years ago
Hi Harsha... In queue processors, how does PRPC understand how to map a work object to a specific Kafka message? Is there any command used, and if so, what is it?
@HarshaTrainingsacademy 2 years ago
Hi, there is no command to be used. While queuing, we need to specify the queue processor name... PRPC automatically creates message data sets in the kafka-data folder for all the queue processors. Each queue processor looks up its queued messages there, then picks them up and processes them.
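Conceptually, each queue processor behaves like a Kafka consumer subscribed to its own topic, so a queued item is picked up by whichever queue processor name was specified at queue time. The sketch below only illustrates that idea using the standard kafka-clients API in plain Java; the topic name, broker address, and group id are hypothetical, and Pega manages its real consumers internally rather than exposing them like this.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

/** Illustration only: a consumer dedicated to one "queue processor" topic. */
public class QueueProcessorConsumerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // hypothetical broker
        props.put("group.id", "MyQueueProcessor");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("MyQueueProcessor"));  // one topic per queue processor
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> rec : records) {
                    // the message payload carries the queued item (e.g., a work object key)
                    System.out.println("Picked up item: " + rec.value());
                }
            }
        }
    }
}
```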
@jyotishkumarreddy3232 2 years ago
I have watched the video and it is very useful to me. Can you please make a video on log files and how to analyze logs, report definitions, drop-down tables, data pages, data transforms, SLAs, and integration?
@HarshaTrainingsacademy 2 years ago
Sure, we will... Keep watching, and thank you again. We are going to do more similar videos on cracking interviews for freshers and experienced resources... Share these videos with your friends too, and please do not forget to subscribe.
@mavurisarika5953 2 years ago
@HarshaTrainingsacademy I am working on Pega, and I used to follow blogs whenever I needed to prepare for interviews. It was only a few days ago that I happened to come across this channel and your topic-wise F2F interview videos. I wish I had known about your channel earlier. I must admit, these videos are really helping me and contain the advanced Q&As for which I couldn't find answers across multiple websites earlier. I'm glad I found your Q&A series, in which you have wonderfully covered complex topics in the simplest way possible. Thank you so much for your efforts and for helping us all. Kindly share such topic-wise F2F Q&A sessions on other advanced concepts like integration services and connectors at the earliest. Thank you.
@HarshaTrainingsacademy 2 years ago
@mavurisarika5953 Thank you, Sarika; sure, we will.
@coolguypravara 5 months ago
Hi Harsha, great videos on Pega interview questions... thank you so much. I have a question regarding the last one, about an advanced agent running in different time zones. Since Pega recommends a Job Scheduler, how can we achieve the same requirement? We don't have data instances for the Job Scheduler, so how can we do this? Thank you.
@HarshaTrainingsacademy 5 months ago
Using a Job Scheduler for different time zones.
Understanding the challenge: Pega's Job Scheduler is designed to replace advanced agents for better performance and manageability. However, unlike advanced agents, job schedulers do not create data instances for each execution, which presents a challenge for running them in different time zones.
Solution: you can achieve this requirement by creating multiple job schedulers, each configured to run at the desired local time for a different time zone. Here is a step-by-step guide:
1. Identify the time zones: determine the time zones in which you need to run the scheduled jobs.
2. Create job schedulers: for each time zone, create a separate job scheduler configured to run at the appropriate local time. Navigate to Records > SysAdmin > Job Scheduler, create a new job scheduler, and fill in the required details such as Name, Class, Run Rule, and Schedule.
3. Schedule according to local time: when setting up the schedule for each job scheduler, convert the local time to the server time zone so that the job runs at the correct local time.
4. Implement business logic: ensure that the business logic within the scheduled jobs is aware of and respects the time zone differences. You can pass the time zone as a parameter if necessary or retrieve it dynamically within the job.
5. Testing: thoroughly test each job scheduler to ensure it runs at the correct time for each time zone.
Example configuration (assume three regions: EST, PST, and GMT):
1. Job scheduler for EST: name JobScheduler_EST, scheduled to run at 2:00 PM EST; convert 2:00 PM EST to server time (e.g., if the server is in GMT, run at 7:00 PM GMT).
2. Job scheduler for PST: name JobScheduler_PST, scheduled to run at 2:00 PM PST; convert 2:00 PM PST to server time (e.g., if the server is in GMT, run at 10:00 PM GMT).
3. Job scheduler for GMT: name JobScheduler_GMT, scheduled to run at 2:00 PM GMT.
By creating separate job schedulers for each time zone and scheduling them according to the server's time zone, you can achieve the desired execution times. While job schedulers in Pega do not have data instances like advanced agents, creating multiple schedulers configured for different time zones fulfills the requirement. Be sure to test each scheduler thoroughly to confirm it runs at the correct local time. Feel free to reach out if you need further clarification or assistance. Best regards, Harsha Trainings.
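To illustrate the conversion in step 3: the local-to-server mapping is simple offset arithmetic that java.time can do directly. The sketch below is illustrative only (the class and method names are made up), and note that region-based zone IDs such as America/New_York account for daylight saving, so the GMT result can differ by an hour from the fixed EST/PST offsets quoted above.

```java
import java.time.LocalDate;
import java.time.LocalTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class ScheduleTimeConverter {

    // Converts a desired local run time in a given zone to the equivalent
    // moment in the server's zone, which is what the scheduler form expects.
    static ZonedDateTime toServerTime(LocalTime localRunTime, ZoneId localZone, ZoneId serverZone) {
        return ZonedDateTime.of(LocalDate.now(localZone), localRunTime, localZone)
                            .withZoneSameInstant(serverZone);
    }

    public static void main(String[] args) {
        ZoneId server = ZoneId.of("GMT");
        // 2:00 PM New York time -> 19:00 GMT in winter (EST, UTC-5), 18:00 GMT in summer (EDT, UTC-4)
        System.out.println(toServerTime(LocalTime.of(14, 0), ZoneId.of("America/New_York"), server));
        // 2:00 PM Los Angeles time -> 22:00 GMT in winter (PST, UTC-8), 21:00 GMT in summer (PDT, UTC-7)
        System.out.println(toServerTime(LocalTime.of(14, 0), ZoneId.of("America/Los_Angeles"), server));
    }
}
```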
@naagreddy 1 year ago
What is the stream node in a queue processor?
@HarshaTrainingsacademy 1 year ago
Hi, this refers to the Stream service, which enables and runs all queue processors in Pega. Thank you so much for visiting our channel; your comments help us provide better content for our subscribers 👍. Please don't forget to subscribe and hit the bell icon 🔔 for more updates. If you are a fresher looking for a job, have a gap after education, or are looking for a well-paid IT job, Harsha Trainings is the right place. Talk to our experts for complete details: +91-9652532753 | +91-9885312299 | +91-8121092753. We offer the following courses with 100% placement assistance: Pega, DevOps, Java, Python, Appian, Salesforce, C, and manual and Selenium automation testing.
@ashirbadparida3268 11 months ago
Suppose we created a new database table in DEV... how do we move the product file from DEV to the QA environment or a higher environment?
@HarshaTrainingsacademy 11 months ago
In Pega, moving database changes, including the creation of a new table, from a development (DEV) environment to a quality assurance (QA) environment or a higher environment involves a combination of database and Pega-specific steps. Below is a general guide:
1. Back up the database: before making any changes, take a backup of the database in the source environment (DEV) so that you have a snapshot before any modifications.
2. Update Pega rules: in Pega, data tables are represented as classes. After creating the new database table in DEV, create the corresponding Pega classes that map to this table using Pega Developer Studio.
3. Generate database scripts from Pega: Pega provides tools to generate database scripts based on the changes made in the Pega rules. Use the "Generate Schema" feature to create SQL scripts that represent the changes needed in the database.
4. Prepare the QA database: ensure that the QA database schema is aligned with the DEV environment. This may involve creating any necessary schemas, users, or configurations required for the new table.
5. Execute the database scripts in QA: run the SQL scripts generated from Pega in the QA environment. This creates the new table along with any associated structures.
6. Migrate test data (if applicable): if the table in DEV contains data that needs to move to QA, use Pega tools or database scripts to migrate it so that QA has a representative dataset.
7. Test in Pega: thoroughly test the functionality and integration of the new table in the QA environment using Pega Developer Studio, and ensure Pega can interact with the new table as expected.
8. Update rule instances: if any Pega rules reference the new table or its properties, update those rule instances to reflect the changes made in the QA environment.
9. Deploy to higher environments (if applicable): if there are additional environments beyond QA, repeat the steps for each subsequent environment, ensuring that each environment is properly prepared and tested.
10. Documentation: update documentation in Pega to reflect the changes made, including any modifications to the Pega rules. This maintains a clear record of the changes and aids future development and troubleshooting.
11. Version control: if your Pega application is part of a larger software project, ensure your Pega rules are appropriately version-controlled to track changes over time.
12. Communication: notify the relevant teams and stakeholders about the changes made and provide any necessary documentation.
@naturalworld1002 2 years ago
In a multi-node environment, where do we need to mention the time zone on the agent rule form?
@HarshaTrainingsacademy 2 years ago
Hi, we need to mention it in the agent schedule data instance...
@naturalworld1002 2 years ago
@HarshaTrainingsacademy Thanks for your reply.
@sushants149 2 years ago
Hi Harsha, can you please tell me how to call an activity in a data transform?
@HarshaTrainingsacademy 2 years ago
Hi, use the pxcallactivity function.
@sushants149 2 years ago
Thanks Harsha
@avinashganti606 2 years ago
Req 1: For team leaders, the managers' worklists shouldn't be shown. Req 2: For managers, all the team members' worklists should be displayed. How can we do this?
@HarshaTrainingsacademy 2 years ago
Hi Avinash... 1. We can do it by using a privilege. 2. This one can also be done by using a privilege.
@suribabu4656 2 years ago
We have an existing case type that we no longer need. How do we delete it?
@HarshaTrainingsacademy 2 years ago
We need to manually delete all the case type rules (whatever rules were created during case creation), then go to the case type explorer and remove the case type.
@ramaambarukhana6463 2 years ago
How is an AQM created, and how does the agent know to process a particular AQM item?
@HarshaTrainingsacademy 2 years ago
Hi, AQM is an internal design of Pega where the queue is a table, PR_SYS_QUEUES... At its scheduled time, the standard agent wakes up and looks up the queue table. It processes the items that match its agent name and have the status Scheduled... the agent then picks up each matching item and processes it.
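As a rough illustration of that lookup, the JDBC sketch below polls a queue table for items matching an agent name and a Scheduled status. This is not Pega's internal code: the JDBC URL, credentials, and column names (pyAgentName, pyItemStatus, pzInsKey) are assumptions used only to show the wake-up-and-poll idea; Pega's standard agents perform this lookup internally against PR_SYS_QUEUES.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

/** Illustration only: poll a queue table for one agent's scheduled items. */
public class AgentQueuePollerSketch {
    public static void main(String[] args) throws SQLException {
        // Hypothetical column names; the real schema of PR_SYS_QUEUES may differ.
        String sql = "SELECT pzInsKey FROM pr_sys_queues "
                   + "WHERE pyAgentName = ? AND pyItemStatus = 'Scheduled'";
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:postgresql://localhost:5432/pegadata", "pega", "secret");
             PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, "ProcessServiceLevelEvents");   // example agent name
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // in Pega, the agent activity would open and process each item here
                    System.out.println("Queued item: " + rs.getString("pzInsKey"));
                }
            }
        }
    }
}
```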
@rubinashaik653 2 years ago
How do we call a queue processor from a job scheduler?
@HarshaTrainingsacademy 2 years ago
Use the queue-for-message method in the job scheduler's activity and call the queue processor there... It will work.
@rubinashaik653 2 years ago
@HarshaTrainingsacademy May I know how to call a flow from an activity? Thanks in advance.
@HarshaTrainingsacademy 2 years ago
@rubinashaik653 Hi, we can call a flow from an activity by calling an OOTB activity like Call New or Call NewFromFlow.
@rubinashaik653 2 years ago
@HarshaTrainingsacademy Thank you so much.
@HarshaTrainingsacademy 2 years ago
Welcome
@paramanranjith8592 2 years ago
Hi Madam and Sir... you're doing excellent work, and it is helpful to everyone... I need your help. I have attended almost 10 company interviews and could answer about 75% of the questions, but there are a few questions I don't know the answers to, so please help me with this. If you want, I will pay an amount as well... please help me with this.
@HarshaTrainingsacademy 2 years ago
Hi Ranjith... no payment is needed. Please reach us on 9652532753 and we will guide you to the next level of success.
@sriram2366 2 years ago
Hi Ranjith, would you please share the interview questions? It will help us.
@sairamgoud4813 2 years ago
Please attach a document with the questions and answers.
@HarshaTrainingsacademy 2 years ago
Hi Sairam, sure... I will do that soon. Thank you.
@suribabu4656 2 years ago
In a flow rule, how do we route an assignment to an operator, and if the operator doesn't complete it in 5 minutes, how do we move on or end the flow?
@HarshaTrainingsacademy 2 years ago
Hi Suri Babu, thank you for asking this question; it is one of the interview questions that we come across regularly... We can implement this in two ways: 1. Create an SLA with the deadline time set to 5 minutes and call an escalation activity; in the escalation activity, call the OOTB activity pxChangeStage and mention the name of the last stage. 2. Create an SLA with the deadline time set to 5 minutes and call an escalation activity; in the escalation activity, call the OOTB activity ForceCaseClose and pass the current case ID. Hope this answers your query clearly... Please subscribe to the channel.
@venkataramanjaneyuluketha7410 2 years ago
Please provide a PDF, sir.
@HarshaTrainingsacademy 2 years ago
Hi Venkata, I need permission to do that...
@anilkumarreddy4053 2 years ago
Please cover integration topics comprehensively, as questions on these topics are not widely available.
@HarshaTrainingsacademy 2 years ago
Sure... we will make a video on this.
@sushants149 2 years ago
Hi Harsha, can we have more than one subprocess shape in a main flow? Please confirm: Start Shape -> Subprocess Shape -> Assignment Shape -> Subprocess Shape -> Assignment Shape -> End Shape