Thursday, 17 October 2024

Heap Size and Apex CPU Time Limit

 

Heap Size -

1) Memory allocated for holding objects, variables, and records during a transaction.

CPU Time Limit -

1) CPU time consumed across the whole transaction.



 

Definition
  • Heap Size: Maximum amount of memory allocated for storing objects and data during an Apex transaction.
  • Apex CPU Time Limit: Maximum amount of CPU processing time an Apex transaction may use.

Measurement
  • Heap Size: Measured in bytes (commonly expressed in MB).
  • Apex CPU Time Limit: Measured in milliseconds (ms).

Typical Limits
  • Heap Size: 6 MB for synchronous transactions, 12 MB for asynchronous transactions.
  • Apex CPU Time Limit: 10,000 ms (10 seconds) for synchronous transactions, 60,000 ms (60 seconds) for asynchronous transactions.

Purpose
  • Heap Size: To manage memory usage and prevent excessive consumption that could affect performance and stability.
  • Apex CPU Time Limit: To manage CPU processing time and prevent long-running operations from degrading platform performance.

Scope
  • Heap Size: Affects how much data and how many objects can be held in memory during a transaction.
  • Apex CPU Time Limit: Affects how long a transaction can use CPU resources for processing.

Common Causes of Issues
  • Heap Size: Large collections or objects, inefficient data handling, excessive memory usage.
  • Apex CPU Time Limit: Complex or inefficient code, lengthy calculations, large data processing tasks.

Resolution Strategies
  • Heap Size: Manage data efficiently, use smaller collections, and clear unused objects.
  • Apex CPU Time Limit: Optimize algorithms, reduce complexity, use batch processing for large tasks.

Impact of Exceeding
  • Heap Size: Throws System.LimitException: Apex heap size too large; the transaction can no longer store or manage data in memory.
  • Apex CPU Time Limit: Throws System.LimitException: Apex CPU time limit exceeded; the transaction is terminated.

Example
  • Heap Size: Storing a large list of records or creating large objects in a single transaction.
  • Apex CPU Time Limit: Running a complex loop or extensive calculations that consume too much processing time.

 

For CPU Time Optimization:

  • Focus on optimizing SOQL queries to reduce the amount of data processed and minimize execution time.
  • Optimize for-loops to avoid excessive iterations and nested loops that increase CPU usage.

For Heap Size Optimization:

  • Optimize SOQL queries to limit the volume of data retrieved and stored in memory.
  • Manage collections and avoid holding large amounts of data in memory for extended periods.
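Both sets of tips above can be sketched in one place. The class below is illustrative (the object, filter, and field handling are assumptions, not a fixed recipe): a SOQL for loop streams records in 200-record chunks instead of materializing one huge list, and DML is done once in bulk rather than inside the loop.

```apex
public class AccountCleanup {
    public static void run() {
        List<Account> toUpdate = new List<Account>();
        // SOQL for loop: records are fetched in batches of 200,
        // which keeps heap usage lower than one giant query result.
        for (Account acc : [SELECT Id, Name FROM Account WHERE CreatedDate = THIS_YEAR]) {
            acc.Name = acc.Name.trim();   // keep per-record work cheap (CPU)
            toUpdate.add(acc);
        }
        update toUpdate;   // one bulk DML statement, not DML inside the loop
    }
}
```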


Integration Pattern in Salesforce

 

Pattern Approach

The canonical pattern list from Salesforce's Integration Patterns and Practices guide:

1) Remote Process Invocation - Request and Reply: Salesforce invokes a remote system and waits for the result.

2) Remote Process Invocation - Fire and Forget: Salesforce invokes a remote system without waiting for completion.

3) Batch Data Synchronization: data is synchronized between Salesforce and an external system in batches (typically ETL plus Bulk API).

4) Remote Call-In: an external system calls into Salesforce through its APIs.

5) UI Update Based on Data Changes: the Salesforce UI refreshes in response to data changes (e.g., via Streaming API or platform events).

6) Data Virtualization: external data is surfaced in Salesforce without being copied (Salesforce Connect).
Streaming API in Salesforce

 

Streaming API enables streaming of events using push technology and provides a subscription mechanism for receiving events in near real time. 


The Streaming API subscription mechanism supports multiple types of events, including PushTopic events, generic events, platform events, and Change Data Capture events.


Note

Did you know? Pub/Sub API is a newer API that you can use to publish and subscribe to platform events and change data capture events. Based on gRPC API and HTTP/2, Pub/Sub API efficiently publishes and delivers binary event messages and supports multiple programming languages.
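For platform events specifically, publishing from Apex goes through EventBus. A minimal sketch, assuming a hypothetical platform event named Order_Event__e with a Status__c field:

```apex
// Publish a platform event (Order_Event__e is a hypothetical example).
Order_Event__e evt = new Order_Event__e(Status__c = 'Shipped');
Database.SaveResult sr = EventBus.publish(evt);
if (!sr.isSuccess()) {
    for (Database.Error err : sr.getErrors()) {
        System.debug('Publish failed: ' + err.getMessage());
    }
}
```

Subscribers (CometD clients, triggers on the event, or Pub/Sub API clients) then receive the event in near real time.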



Design Pattern in Apex

 1) Singleton - example: a utility class that should be instantiated only once per transaction.
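A minimal Singleton sketch (the class name ConfigUtility is illustrative). Note that in Apex a static variable lives only for the duration of a single transaction, so this guarantees one instance per transaction, not per org:

```apex
public class ConfigUtility {
    private static ConfigUtility instance;

    // Private constructor prevents direct instantiation.
    private ConfigUtility() {}

    public static ConfigUtility getInstance() {
        if (instance == null) {
            instance = new ConfigUtility();
        }
        return instance;
    }
}
```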


2) Facade  - A simple and easy-to-understand example of the Facade pattern is a Computer System. When you press the power button on your computer, several subsystems (like the CPU, Memory, Hard Drive) need to work together to start the computer. Instead of interacting with these subsystems individually, you interact with a single, unified interface—the power button.
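The computer analogy above translates directly into an Apex sketch (all class and method names are illustrative): the subsystems are hidden behind a single powerOn() call.

```apex
public class ComputerFacade {
    // Subsystems, kept as inner classes for a self-contained sketch.
    public class Cpu       { public void start() { System.debug('CPU started'); } }
    public class Memory    { public void load()  { System.debug('Memory loaded'); } }
    public class HardDrive { public void read()  { System.debug('Disk read'); } }

    // The "power button": one call orchestrates all subsystems.
    public void powerOn() {
        new Cpu().start();
        new Memory().load();
        new HardDrive().read();
    }
}
```

A caller just writes `new ComputerFacade().powerOn();` and never touches the subsystems directly.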


3) Strategy pattern - a perfect example of runtime polymorphism, where different behaviors (strategies) can be swapped in and out dynamically, depending on the situation.
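A Strategy sketch in Apex (the pricing domain and all names are illustrative): the behavior is passed in as an interface, so it can be swapped at runtime without changing the caller.

```apex
public class PricingService {
    public interface DiscountStrategy { Decimal apply(Decimal amount); }

    public class NoDiscount implements DiscountStrategy {
        public Decimal apply(Decimal amount) { return amount; }
    }
    public class SeasonalDiscount implements DiscountStrategy {
        public Decimal apply(Decimal amount) { return amount * 0.9; }
    }

    // The strategy is injected, so behavior is chosen at runtime.
    public static Decimal price(Decimal amount, DiscountStrategy strategy) {
        return strategy.apply(amount);
    }
}
```

A caller can pass `new PricingService.SeasonalDiscount()` during a sale period and `new PricingService.NoDiscount()` otherwise, without modifying PricingService.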


4) Factory Pattern - define an interface and let a factory class decide which implementation to instantiate.
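A Factory sketch in Apex (the notification domain and all names are illustrative): callers ask the factory for a channel by name and depend only on the interface, never on a concrete class.

```apex
public class NotifierFactory {
    public interface Notifier { void send(String message); }

    public class EmailNotifier implements Notifier {
        public void send(String message) { System.debug('Email: ' + message); }
    }
    public class SmsNotifier implements Notifier {
        public void send(String message) { System.debug('SMS: ' + message); }
    }

    // Factory method: picks the concrete implementation for the caller.
    public static Notifier getNotifier(String channel) {
        if (channel == 'SMS') {
            return new SmsNotifier();
        }
        return new EmailNotifier();
    }
}
```

Adding a new channel means adding one implementation class and one branch in the factory; no caller changes.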

Authorization ways for API

 

https://www.apexhours.com/salesforce-oauth-flows-guidelines-and-tips/

Inbound – you decide! Salesforce supports the following flows


OAuth 2.0 provides different types of authorization flows:

  1. Web Server Flow - used when a user logs in through a web app and a UI is involved, e.g., Workbench.

  2. User-Agent Flow - used for desktop/mobile applications, e.g., the Salesforce mobile app or Data Loader.

  3. JWT Bearer Token Flow - ideal for applications that access Salesforce only through the API, since no UI is involved; for example ETL tools or middleware.
  • Based on a digital signature in a JSON Web Token (JWT).
  • No refresh token is returned.

  4. SAML Bearer Assertion Flow
  • Used for SSO.
  • This flow also returns only an access token, not a refresh token.

  5. SAML Assertion Flow

  6. Username-Password Flow

  7. Device Flow
  • Typically used by applications on devices with limited input or display capabilities, such as TVs, appliances, or command-line applications.

  8. Asset Token Flow

  9. Refresh Token Flow
  • Used to obtain a new access token when the current one expires; it returns an access token, not a new refresh token.

Named Credentials and Remote Site Settings

 𝐂𝐨𝐦𝐩𝐚𝐫𝐢𝐬𝐨𝐧 𝐛𝐞𝐭𝐰𝐞𝐞𝐧 𝐍𝐚𝐦𝐞𝐝 𝐂𝐫𝐞𝐝𝐞𝐧𝐭𝐢𝐚𝐥𝐬 𝐚𝐧𝐝 𝐑𝐞𝐦𝐨𝐭𝐞 𝐒𝐢𝐭𝐞 𝐒𝐞𝐭𝐭𝐢𝐧𝐠𝐬 



1) Named Credentials in Salesforce support a variety of authentication flows, including:

  • No Authentication
  • Password Authentication
  • OAuth 2.0 (Web Server Flow, User-Agent Flow, JWT Bearer Token Flow)
  • OAuth 2.0 SAML Bearer Assertion Flow
  • Client Certificate Authentication


🚀 𝐍𝐚𝐦𝐞𝐝 𝐂𝐫𝐞𝐝𝐞𝐧𝐭𝐢𝐚𝐥𝐬 :

▶️ Used for secure and easy authentication with external services, such as APIs.
Support various authentication methods, including Password Authentication, OAuth 2.0, and JWT (JSON Web Token).
 
▶️ Store credentials securely, separating the authentication details from the code. Passwords and tokens are stored in a dedicated credential record.
 
▶️ Ideal for API integrations where authentication is required. It abstracts and centralizes credential management.
 
▶️ Typically used for callouts made from Apex code, such as when making HTTP requests to external services.
 
▶️ Allows administrators to control access and permissions at a granular level, specifying which users or profiles can use specific Named Credentials.

🚀 𝐑𝐞𝐦𝐨𝐭𝐞 𝐒𝐢𝐭𝐞 𝐒𝐞𝐭𝐭𝐢𝐧𝐠𝐬 :

▶️ Used to allow Salesforce to call out to external endpoints over HTTP(S). They tell Salesforce which external domains are safe to access.
 
▶️ Do not handle authentication. If the endpoint requires credentials, they must be built into the request in code (e.g., authorization headers set manually in Apex).
 
▶️ Do not store credentials. They only define an allow-list of endpoints for callouts.
 
▶️ Used for callouts from Apex, Visualforce, and other server-side features when a Named Credential is not configured for the endpoint.
 
▶️ Appropriate for simple integrations where no managed authentication is needed, or for legacy callouts that construct their own authorization headers.
 
▶️ Do not provide granular access control for individual users or profiles. They are applied at the organization level.


🚀 𝐖𝐡𝐞𝐧 𝐓𝐨 𝐔𝐬𝐞 𝐍𝐚𝐦𝐞𝐝 𝐂𝐫𝐞𝐝𝐞𝐧𝐭𝐢𝐚𝐥𝐬 𝐚𝐧𝐝 𝐑𝐞𝐦𝐨𝐭𝐞 𝐒𝐢𝐭𝐞 𝐒𝐞𝐭𝐭𝐢𝐧𝐠𝐬 :

𝐍𝐚𝐦𝐞𝐝 𝐂𝐫𝐞𝐝𝐞𝐧𝐭𝐢𝐚𝐥𝐬 :

✅ You need to make server-side callouts.
✅ Authentication with an external service is required.
✅ You want to centralize and secure credential management.

𝐑𝐞𝐦𝐨𝐭𝐞 𝐒𝐢𝐭𝐞 𝐒𝐞𝐭𝐭𝐢𝐧𝐠𝐬 :

✅ You need to allow callouts to specific external domains.
✅ The callout does not require managed authentication (or handles it in code).
✅ You are not using a Named Credential for that endpoint.
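A minimal callout sketch using a hypothetical Named Credential called My_API; the callout: scheme lets Salesforce substitute the endpoint URL and inject the configured authentication for you.

```apex
HttpRequest req = new HttpRequest();
// 'callout:My_API' resolves to the Named Credential's URL and auth settings.
req.setEndpoint('callout:My_API/v1/orders');
req.setMethod('GET');
HttpResponse res = new Http().send(req);
System.debug(res.getStatusCode());
```

With a Remote Site Setting instead, the full URL would be hard-coded and any authorization header would have to be set manually in the request.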




Large Data Volume in Salesforce

 Large Data Volume (LDV) in Salesforce refers to managing and working with large quantities of data, often exceeding typical operational limits. As organizations grow and data accumulates, handling large data volumes effectively becomes crucial to maintain performance, reliability, and user experience. Here’s a detailed overview of managing Large Data Volumes in Salesforce:

Challenges with Large Data Volumes

  1. Performance Impact

    • Query Performance: Queries on large datasets can become slow, affecting page load times and user experience.
    • Data Operations: Operations such as insert, update, and delete can be slower due to the volume of records being processed.
  2. Data Storage Limits

    • Storage Quotas: Salesforce imposes limits on the amount of data you can store. Exceeding these limits can lead to additional costs or require data management strategies.
  3. Search and Reporting Performance

    • Indexing: Searching and reporting on large datasets may suffer from performance issues if indexing is not properly managed.
  4. Batch Processing

    • Limits: Handling large data volumes often requires batch processing, which must be managed within Salesforce’s governor limits and batch size constraints.

Strategies for Managing Large Data Volumes

  1. Data Modeling and Indexing

    • Efficient Data Model: Design your data model to optimize performance. Use custom indexes for fields that are frequently queried or used in search conditions.
    • Indexing: Ensure that critical fields are indexed to improve query performance.
  2. Optimized Queries

    • Selective Queries: Use selective queries that filter data effectively. Avoid querying large datasets without appropriate filters.
    • SOQL Best Practices: Use LIMIT clauses, query only the fields you need (SOQL has no SELECT *), and leverage relationship queries to retrieve only necessary data.
  3. Data Archiving and Purging

    • Archiving: Regularly archive old or unused data to external storage solutions. Salesforce’s data export tools and APIs can help with this process.
    • Purging: Implement data retention policies to delete obsolete records. Ensure that purging operations are compliant with data governance policies.
  4. Batch Processing

    • Batch Apex: Use Batch Apex to process large volumes of data asynchronously. Break down large operations into smaller chunks to stay within governor limits.
      public class LargeDataBatch implements Database.Batchable<sObject> {
          public Database.QueryLocator start(Database.BatchableContext bc) {
              return Database.getQueryLocator('SELECT Id FROM Account');
          }
          public void execute(Database.BatchableContext bc, List<sObject> scope) {
              // Process records
          }
          public void finish(Database.BatchableContext bc) {
              // Post-processing
          }
      }
    • Queueable Apex: For smaller batch operations or to chain jobs, use Queueable Apex.
  5. Efficient Data Loading

    • Data Loader: Use Salesforce Data Loader or other ETL tools to handle bulk data operations efficiently. Use the Bulk API for processing large volumes of data.
    • Data Import Wizard: Suitable for smaller datasets or less complex data import scenarios.
  6. Search Optimization

    • Custom Search Indexes: Use custom indexes and optimize search queries to improve performance.
    • SOSL: Use Salesforce Object Search Language (SOSL) to perform efficient full-text searches.
  7. Data Management Tools

    • Third-Party Tools: Consider using third-party tools for data management, analysis, and visualization that are optimized for handling large datasets.
    • Salesforce Optimizer: Use Salesforce Optimizer to review and improve performance across various aspects of your Salesforce instance.
  8. Governance and Monitoring

    • Monitor Performance: Regularly monitor performance metrics and adjust strategies as needed.
    • Governor Limits: Be aware of and adhere to Salesforce governor limits to ensure smooth operation.
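The LargeDataBatch class shown under Batch Apex above would be started like this (a scope size of 200 is a common default; tune it downward if execute() hits limits):

```apex
// Kick off the batch job; each execute() call receives up to 200 records.
Id jobId = Database.executeBatch(new LargeDataBatch(), 200);
System.debug('Batch job started: ' + jobId);
```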

Salesforce Connect -

 

Data Virtualization

A) Showing data from an external database (for example, Oracle) inside Salesforce.

External objects are read-only by default, but they can be made writable to support CRUD operations.

Steps for Salesforce Connect -

1) External Data Source - configure an external data source in Setup using the OData 2.0 adapter.

2) Enter authentication credentials.

3) Create the external object(s) from the external data source.
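Once the external object exists, it can be queried like any sObject; a sketch assuming a hypothetical external object named Order__x (external objects use the __x suffix, and ExternalId is one of their standard fields):

```apex
// External objects are queried with ordinary SOQL;
// the data itself stays in the external system (e.g., Oracle).
List<Order__x> orders = [SELECT ExternalId, Name__c FROM Order__x LIMIT 10];
System.debug(orders.size());
```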


B) Showing data from Salesforce to other applications.


https://www.jitendrazaa.com/blog/salesforce/implementing-custom-apex-adapter-for-salesforce-connect/




Bulk API

 - Bulk API runs as a REST API.

- Bulk API calls count against API limits.

- Bulk API is designed for very large loads (Bulk API 2.0 supports up to 150 million records per rolling 24-hour period).

- Data Loader is recommended for up to about 5 million records; for larger volumes, use the Bulk API directly.

- Load data via the Bulk API using cURL, AppExchange tools, or Data Loader (with its Bulk API setting enabled).

- Ideally use a batch size between 1,000 and 10,000 records.

- Monitor batch jobs (Setup → Bulk Data Load Jobs).

Friday, 6 September 2024

Interview Questions

 Gartner interview questions:

1- Write an LWC component to fetch an Account with its related Contacts and Opportunities and display them on the UI.
2- Can we use setup objects in a screen flow?
3- Can we call a future method from a flow?
4- Can we use a permission set to control page layout assignment?
5- Explain muting permission sets.
6- How can we use owner-based sharing rules?
7- Why are sObjects allowed as parameters in Queueable jobs but not in future methods?
8- Do the out-of-the-box APIs work for custom objects?
9- There is a component on a record page, and we want to display it to only 5 users in the org, from different profiles. In future this number may rise to 10 or 15. What solution would you implement?
10- Parent-to-child and child-to-parent communication in LWC.
11- When is a wire method called in the component life cycle?
12- Write a SOQL query to fetch the opportunity with the maximum amount.
13- While using a wire method in an LWC component, how does the system know that data has changed in the backend?
14- What configuration is required for performing inbound and outbound integrations?
15- In an LWC component I need to display records from a custom object in a table, and there are more than 50,000 records. How can we display all of them without using pagination?
16- Write a trigger to update User record information to match the Contact whenever Contact information is inserted or changed.
17- What happens if we get a mixed DML operation in a flow transaction?
18- How can a manager in a certain role hierarchy share records with their subordinates?
19- An Apex class declared "with sharing" contains a SOQL query on the Account object that references a custom field abc__c. A user who does not have access to this field runs the code. What error would they get?
20- How can we control field-level security in Apex code?
21- I have a class B and I want its sharing behavior decided at runtime: if class B is called from a "with sharing" class A it should run with sharing, and if it is called from a "without sharing" class it should run without sharing.
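For question 21, Apex has a keyword for exactly this: a class declared with inherited sharing runs with the sharing mode of whichever class calls it. A minimal sketch (the class name and query are illustrative):

```apex
// Runs 'with sharing' when called from a with-sharing class,
// and 'without sharing' when called from a without-sharing class.
// When used as an entry point (e.g., from LWC), it defaults to 'with sharing'.
public inherited sharing class ClassB {
    public List<Account> fetchAccounts() {
        return [SELECT Id, Name FROM Account LIMIT 10];
    }
}
```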
