Looming Threat — Your Salesforce database is creaking under the strain of too much data. Follow these six steps to grow past 100 million records:

In early 2017, Salesforce Client Success called Thanawalla Digital to review and correct the slow data access of one of their premier clients. The initial client call told us that their database was growing exponentially and would be past 100 million records before the end of the year. Their Salesforce processes were so cumbersome that they had resorted to moving sets of data off the Salesforce platform each night, altering the data, and then moving those altered data-subsets back into Salesforce: the most heinous “swivel chair” process imaginable, excruciatingly slow and error-prone.

At the time, they had about 35 million records. There was simply no way they were going to grow successfully to 100 million records and beyond. A single data-set turnaround was taking them almost a week: queries were relegated to running overnight in batch mode, then exported, manipulated, and imported again. Their frustration was palpable. Salesforce invited us in because we had a history of successfully solving insidious data access problems within client orgs.
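To make the anti-pattern concrete, here is a minimal sketch of what such a nightly swivel-chair cycle might look like in Python, using the open-source simple_salesforce library. The credentials, the object and field names (Order__c, Status__c), and the recompute_status transformation are all illustrative placeholders, not the client's actual schema or process:

```python
from simple_salesforce import Salesforce

# Placeholder credentials; a real org would use its own authentication.
sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")

def recompute_status(record):
    """Hypothetical off-platform transformation applied to each record."""
    return "Processed" if record["Status__c"] == "Pending" else record["Status__c"]

# Step 1: pull last night's subset out of Salesforce.
result = sf.query_all(
    "SELECT Id, Status__c FROM Order__c WHERE LastModifiedDate = YESTERDAY"
)

# Step 2: alter the data outside the platform.
updates = [
    {"Id": rec["Id"], "Status__c": recompute_status(rec)}
    for rec in result["records"]
]

# Step 3: push the altered subset back into Salesforce via the Bulk API.
sf.bulk.Order__c.update(updates)
```

At tens of millions of records, every pass through a loop like this adds export/import latency and opens a window in which the off-platform copy can drift from the live data, which is how a single turnaround stretches toward a week.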

We met with the client team in early April to start sifting through their data architecture and work processes. Immediately, we recognized that they had data-duplication errors, broken indexes, and an org-health report that bled red on too many key metrics.
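As one illustration of the kind of check involved, duplicate records can be surfaced directly with an aggregate SOQL query. The Contact object and Email field here are assumptions for the example, not the client's actual schema:

```python
from simple_salesforce import Salesforce

# Placeholder credentials, as in the earlier sketch.
sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")

# Aggregate SOQL: group Contacts by Email and keep only groups with more
# than one record, i.e. candidate duplicates.
duplicates = sf.query(
    "SELECT Email, COUNT(Id) cnt FROM Contact "
    "GROUP BY Email HAVING COUNT(Id) > 1"
)
for row in duplicates["records"]:
    print(row["Email"], "appears", row["cnt"], "times")
```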

Over the next few weeks, we developed six concrete tactics to consolidate their workflows entirely inside Salesforce, ultimately bringing their process down from days to minutes. Here are those tactics, as explained by our Chief Technical Architect, Moyez Thanawalla, in the Developer Zone at Dreamforce 2017.