User Guides

September 17th, 2018
Cloud Lending Solutions’ (CLS) Customer Success Engineering team is frequently contacted with questions on data storage, storage limits, and management strategies. This month I’m covering these topics, starting with Salesforce’s Data Storage basics, including how to check usage and the tools available to clean up storage. I also provide a couple of tips to optimize storage specific to CLS components. It is important that all SFDC Administrators and Architects understand these Data Storage basics.

Data Storage Limits: Salesforce uses a multitenant architecture, which necessitates per-org limits to ensure that all orgs perform consistently. Data storage is one of these limits. Storage allocation depends on your User License and your edition of Salesforce.

How to analyze Data Storage Usage: You can see which objects are consuming your Data and File Storage by going to Setup -> Data Management -> Storage Usage.

Most records stored in Salesforce consume 2 KB each. Object records such as Leads, Contacts, and Accounts are all 2 KB each, as are all CLS object records.

A few record types consume different amounts of storage:

  • Person Accounts are 4 KB
  • Campaigns are 8 KB
  • Articles are 4 KB
  • Email Messages storage is based on actual size of emails

Two methods to clean up unnecessary non-CLS data are:

  • Salesforce’s Mass Delete Wizard.
  • ETL (Extract, Transform, Load) tools, which are typically used for data migration but can also be used for data removal.

Cloud Lending Solutions’ Data Storage: As expected, the number of applications and loans that are originated and funded impacts storage usage. Additionally, the configuration of those loans, such as payment frequency and the number of terms on a contract, will also increase the storage needed.

Two major Data Storage consumers in CLS include Amortization schedules and Batch processing logs. Below are a few suggestions for optimizing data usage in regard to these processes.

1) Amortization Schedules (AMZ)

  • If an AMZ schedule is generated at the time of application creation, then archive or delete the AMZ schedules of applications that are converted to Loans or are rejected.
  • Ensure via custom settings that no AMZ schedule is generated for non-AMZ loans.
  • Develop a custom batch job to delete AMZ schedules for closed loans (see the sketch after this list).
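
As a hedged illustration of the last suggestion, such a purge job might look like the sketch below. Every object and field name here (AMZ_Schedule__c, Loan_Contract__c, Status__c) is an illustrative stand-in, not CLS’s actual schema, so map them to the objects in your org before adapting it.

    // Sketch of a purge batch job for AMZ schedules on closed loans.
    // All object/field names below are illustrative stand-ins for your org's schema.
    global class AmzSchedulePurgeJob implements Database.Batchable<SObject> {
        global Database.QueryLocator start(Database.BatchableContext bc) {
            // Select schedule rows whose parent contract is closed.
            return Database.getQueryLocator(
                'SELECT Id FROM AMZ_Schedule__c WHERE Loan_Contract__r.Status__c LIKE \'Closed%\'');
        }
        global void execute(Database.BatchableContext bc, List<SObject> scope) {
            // allOrNone = false: one undeletable row does not fail the whole chunk.
            Database.delete(scope, false);
        }
        global void finish(Database.BatchableContext bc) {}
    }

Launching it with Database.executeBatch(new AmzSchedulePurgeJob(), 2000); uses the maximum chunk size, which keeps the delete efficient for large volumes.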

2) Batch Process logs:

  • Develop a custom batch job to delete Batch Process Logs that are older than a certain number of months (e.g., 3 months); a one-off version is sketched below.
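
For a one-off cleanup, something like the following anonymous Apex (run from the Developer Console) would do; loan__Batch_Process_Log__c is an assumed object name, so substitute whatever your org actually uses.

    // Hypothetical one-off cleanup of batch process logs older than three months.
    Datetime cutoff = Datetime.now().addMonths(-3);
    // LIMIT 10000 keeps the delete within the 10,000-row DML limit per transaction.
    List<SObject> oldLogs = [SELECT Id FROM loan__Batch_Process_Log__c
                             WHERE CreatedDate < :cutoff LIMIT 10000];
    delete oldLogs;

For larger volumes, wrap the same query and delete in a batch job like the AMZ sketch above and schedule it monthly.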

Stay tuned for the January 2019 release. CLS is planning to provide hooks to enable integration with standard Data Archival modules.

 

August 20th, 2018
CLS’s Beta testing Pilot program:
During our past two releases, Lynx and Orion, CLS successfully partnered with customers and implementation partners to test the Beta versions of the new releases before the GA releases. Two customers and one implementation partner participated in Lynx Beta testing; that number increased to three customers and two implementation partners for Orion Beta testing. Both initiatives were successful and helped the CLS product team release quality GA versions.

We are nearing the release of another version of CLS products. Its GA version is scheduled for September 30th, 2018, and the Beta version will be ready a month before the actual release. We expect more customers to participate in the Beta testing pilot for this release. If you want to participate, please contact your Cloud Lending representative.

Let me explain what CLS’s Beta Testing program is!

What is a Beta Test?

A test of our product(s) performed by real users in real environments.

  • It is typically the final test prior to shipping our product globally.
  • It extends QA – never replaces it.

A month prior to the GA release date (every four months), our product team releases a Beta version. CLS’s CSE team contacts customers willing to test the Beta version and works with them throughout the testing. Issues found by customers during Beta testing are fixed by the product team before the GA release, preventing regressions and new bugs in the latest release.

What process is involved in the Beta Testing?

  • Identifying Target Customers who agree to the mutual benefits of Beta Testing and are willing to allocate resources to execute the test.
  • Deploying a CLS Product version to Target Customers before providing it to all other customers. This is done a month before the GA release date. Customers get a window of 2-3 weeks to do the UAT and provide feedback.
  • Gathering Feedback from those Customers.
  • Evaluating Feedback to obtain Manageable/Actionable Data.
  • Distributing & Integrating Data into appropriate CLS Departments to improve the quality of the release.

What are the benefits of Beta Testing?

  • Very cost-effective, especially when compared to similar data-gathering methods.
  • Involves customers in CLS’s product at a critical stage of development: QA.
  • Provides real feedback from real customers.
  • Generates valuable data for all teams involved in product development.
  • Helps test the post-install scripts against real customer data.
  • Finds environment and customer specific information that is impossible to find in an internal QA environment.

 

July 16th, 2018
Loan Payment Batch Processing
Last month I wrote about general improvements in Batch processing. In this edition of Raj’s Corner, I’ll stay on the Batch Processing topic and provide tips specific to CLS’s most critical batch processing job – the Loan Payment Transactions (LPT) Creation Job.

The Loan Payment Transactions (LPT) Creation Job processes contracts that have automated payments configured, creating and clearing payment transaction records. Most customers schedule this job at least once a day, resulting in thousands of payment transactions created daily. Given the critical nature of this batch job, fault tolerance and on-time completion are extremely important. The following three configuration tips will help ensure that the process completes successfully.

  1. Defining a Custom Batch Size for the LPT Creation Job: Since the Vela release, customers can define a batch size different from the one used for other batch jobs. This setting helps optimize the size of the batch to eliminate failures due to Salesforce limits, preventing errors such as “CPU time limit exceeded.” To use this setting, customers create a custom field that defines the batch size specific to the LPT Creation job. More details are available in this Support Center article.
  2. Fault Tolerance in Processing Loan Payment Batches: Prior to the Lynx release, one payment failure within a larger batch of payments would cause all payments in the batch to fail. In the Lynx release, a feature was added to ensure that a failure in one record results in the failure of only that particular payment; the rest of the payments in the batch continue to be processed. To enable this behavior, check the “Enable Fault Tolerant Batch Processing” flag in the “Org Parameters” Custom Settings.
  3. Splitting the LPT Creation and Clearing Actions: When you are creating a large number of ACH files, it is useful to split the clearing of the payment from the payment creation. This reduces the time needed to create the LPTs, since creation no longer depends on the clearing process. For larger volumes, where borrower payments originate from various online payment modes, it is recommended to schedule the LPT Clearing job every 2 to 4 hours. Scheduling the Clearing job multiple times a day minimizes the load on the LPT Clearing job, which executes after the LPT Creation job (see the sketch after this list).
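
To make tips 1 and 3 concrete, here is a hedged sketch; LPTCreationBatch and LPTClearingSchedulable are hypothetical stand-ins for whatever classes wrap these jobs in your org, and the batch size shown is only an example.

    // Tip 1 (sketch): launch the LPT Creation job with its own, smaller batch size.
    Integer lptBatchSize = 50;  // value you would normally read from your custom batch-size field
    Database.executeBatch(new LPTCreationBatch(), lptBatchSize);

    // Tip 3 (sketch): schedule the LPT Clearing job every four hours.
    // Cron fields: seconds minutes hours day-of-month month day-of-week.
    System.schedule('LPT Clearing - every 4 hours', '0 0 0/4 * * ?', new LPTClearingSchedulable());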

 

June 15th, 2018
Running Batch Jobs – Then and Now
From Raj – our Technical Architect, Services: Dear Customers – During my interactions with you, I often come across common questions and requests. I look forward to providing you tips and insights that will improve the operational efficiencies of your business! I’ll begin our series with information on the faster, more fault-tolerant Batch Jobs in CLS products.

Let’s first see what Batch Jobs mean from Salesforce’s perspective. You can write code in Salesforce’s Apex language to process a set of records asynchronously in batches. This is the best solution if you need to process a large number of records periodically (e.g., process all Active CL Contracts daily).
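
For readers new to Batch Apex, the minimal shape of such a job looks like the skeleton below; the object name and status filter are assumptions chosen for illustration.

    // Minimal Batch Apex skeleton; the query and object names are illustrative only.
    global class DailyContractJob implements Database.Batchable<SObject> {
        global Database.QueryLocator start(Database.BatchableContext bc) {
            // Define the full record set once; Salesforce feeds it to execute()
            // in chunks of up to 200 records by default.
            return Database.getQueryLocator(
                'SELECT Id FROM loan__Loan_Account__c WHERE loan__Loan_Status__c LIKE \'Active%\'');
        }
        global void execute(Database.BatchableContext bc, List<SObject> scope) {
            // Each chunk runs in its own transaction with its own governor limits.
        }
        global void finish(Database.BatchableContext bc) {
            // Runs once at the end; a natural place to chain the next job in a sequence.
        }
    }

Database.executeBatch(new DailyContractJob()); queues an immediate run, and a Schedulable wrapper passed to System.schedule handles the “daily” part.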

CLS Batch Jobs: CLS Batch Jobs execute a series of programs on an application or a contract without any manual intervention. These jobs can be run on predefined schedules or triggered by the occurrence of specific events.
In CL Loan, the Start of Day (SOD), End of Day (EOD), and Loan Payments Processing jobs are some of the key batch jobs; they include sets of jobs for processing transactions, creating reports, and performing tasks for starting and closing a financial business day. Some of the batch jobs included in the SOD and EOD jobs can also be run independently on an ad-hoc basis.

Over the course of various releases, CLS has made significant improvements in Batch Job processing in terms of speed, error logging, and making various jobs fault-tolerant. By upgrading to CLS’s latest release you will get the following benefits:

  1. Parallel Processing of SOD Jobs:
  • For customers with versions greater than 2.2003 (Vela release), I highly recommend using this feature. The customers who have implemented this feature have seen the SOD batch jobs run 40% to 70% faster.
  • CL Loan leverages Salesforce’s permissible limit for running concurrent SOD batch jobs. You can configure the org parameter ‘Concurrent Batch Jobs‘ to run up to five parallel instances of the SOD job.
  • The migration script ‘AssignThreadToLoanContracts’ assigns thread numbers, ranging from 1 to the number set in the Concurrent Batch Jobs custom setting, to all existing contracts. This occurs when you upgrade to or install CL Loan 2.5000 or a later version. Every new contract is automatically assigned a thread number, and contracts are split for processing based on their thread numbers.
  2. Optimization in Batch Processing: Since the Hydra release, most batch jobs intelligently filter the contracts they process based on certain statuses and field values. For example, the fees processing batch job runs only if “Active-Bad Standing” contracts exist for that day. Batch jobs are also filtered by loan product: the broker payout job must run only for LOC contracts, so it filters out contracts for other loan products and processes only LOC contracts.
  3. Immediate Start of Next Day: Before the Vela release, if any batch job in the SOD chain failed, the date would not move to the next day, halting the business and causing a Severity 1 situation. Since the Vela release, when the End of Day job finishes, it immediately updates the system date to the next day. This eliminates the possibility of the financial day being stuck at the previous date due to the failure of any job chained in the SOD execution.
  4. Fault Tolerance in Processing Loan Payment Batches: Before the Lynx release, if one payment failed while multiple payments were being processed in a batch, the whole batch failed and none of the payments were processed. A failure of one record should not leave all records in the batch unprocessed. Since the Lynx release, a failure in one record results in the failure of only that particular payment; the rest of the payments in the batch continue to be processed (see the sketch after this list). To enable this feature, check the “Enable Fault Tolerant Batch Processing” flag in the ‘Org Parameters’ settings.
  5. Improved Error Logging in SOD Jobs: Since the Vela release, for critical SOD jobs, any event that causes a batch to fail is logged to allow easy identification of the failing record and its correction and reprocessing. For example, if the Billing job fails, the error log identifies the loan account and the bill that caused the failure. You can review all errors in the Batch Process Logs. Failures and errors are logged for the EOD, SOD, Change Interest Rate, Interest Posting, Billing, and Delinquency Processing batch jobs.
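
The pattern behind item 4 is partial-success DML plus per-record logging. The sketch below shows the idea inside a batch job’s execute method; Batch_Process_Log__c and its fields are assumed names, not CLS’s actual log schema.

    // Fault-tolerance sketch for the execute() method of a Database.Batchable class.
    // Log object/field names are assumptions; CLS's real implementation may differ.
    global void execute(Database.BatchableContext bc, List<SObject> scope) {
        // ... apply the per-payment business logic to 'scope' here, then save:
        // allOrNone = false: failing records are reported individually instead of
        // rolling back the entire chunk.
        Database.SaveResult[] results = Database.update(scope, false);
        List<Batch_Process_Log__c> logs = new List<Batch_Process_Log__c>();
        for (Integer i = 0; i < results.size(); i++) {
            if (!results[i].isSuccess()) {
                // Capture enough detail to identify, correct, and reprocess the record.
                logs.add(new Batch_Process_Log__c(
                    Failed_Record__c = scope[i].Id,
                    Error_Message__c = results[i].getErrors()[0].getMessage()));
            }
        }
        if (!logs.isEmpty()) insert logs;
    }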

In addition to the above, there are several batch jobs that can be run standalone. The CLS Support website has a library of knowledge notes that give you specific instructions on:

  • Scripts to run a batch job in stand-alone mode in the Developer Console (an example follows this list).
  • Parameter definitions for specific batch jobs.
  • Specific failure cases and how to generate error logs for them. For example, if a bill was due today and the BillingJob did not generate it.
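
As an example of the first item, a standalone run from the Developer Console’s Execute Anonymous window might look like this; loan.BillingJob is an illustrative class name, not a documented one, so take the actual class and parameters from the relevant knowledge note.

    // Hypothetical Execute Anonymous snippet: queue one batch job standalone.
    Id jobId = Database.executeBatch(new loan.BillingJob(), 200);
    System.debug('Queued standalone billing run: ' + jobId);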

I look forward to sharing additional insights in the next newsletter. 

For more information, to ask a question, or to request a demo, contact us.