Leverage Big Objects to manage your data archival in Salesforce

When Salesforce was introduced in 1999, it was used simply as a software solution to automate the sales process. In the last couple of decades, Salesforce has come a long way from those humble beginnings to become an enterprise-wide platform adopted by leading companies. Its product offerings have expanded to include Sales, Marketing, Service, CPQ, Billing, Analytics, Data, Community, Custom Solutions, and IoT.

If your organization has been using Salesforce applications for a few years, you may have found that data management has become complex and messy, with large amounts of data that are no longer used. As a result, you may already be running into Salesforce data storage limits.

Additional storage can cost up to $1,500 per year for 500 MB, so extra capacity becomes expensive if you have many users and applications generating data. Redundant and unused data also creates clutter for users and can lead to governor limits being breached. If you are at this stage, you need an effective data archival strategy for your Salesforce infrastructure.

Salesforce provides data archival capabilities in its application that can be used to archive the data and free up storage space.


Why Should You Archive Data in Salesforce?

  • Large data volumes can result in slower query performance, impacting user experience.
  • Cleaning up unwanted data reduces clutter for users and drives better adoption.
  • Archiving gives your organization greater control over its information processes.
  • Archiving production data reduces storage costs.
  • Archived data is retained safely and remains available if it is needed later.

How to Plan for Data Archival in Salesforce?

Every organization using Salesforce needs an effective data archival strategy to manage storage in the long run. Before you begin, define a clear archiving policy and gather input from all stakeholders.

Storage and Limits

Keep track of how much of your total available storage you are currently using. Salesforce does provide extra storage, but it comes at a cost. Also measure how your data storage grows over time so you can project when you may start exceeding limits.

Usage Trends

Review data usage metrics in Salesforce to identify which objects are responsible for the most data usage. Make use of suitable tools and dashboards to figure out which data needs to be archived.


Legal and Compliance Considerations

You never know which data may come in handy in the future, so it is necessary to consult your organization's legal team before deleting any of it. Data integrity issues, such as removed fields or broken parent-child relationships, can also surface later; it is always better to be on the safe side.

Data Archival Framework

  • Determine where to store the data and how frequently it may need to be accessed. Your IT department can help with this task.
  • Bring the data owners and the legal team together to establish what the data contains. Use suitable tools and dashboards to separate necessary data from irrelevant data.
  • Classify the stored data and define rules specifying which data is to be retained, deleted, or archived off the system. The policy will help you choose storage locations based on the ease and level of access needed.
  • Implement the data archiving policy.
  • Review the policy periodically so it stays relevant as the organization and external requirements change.

Data Archival Using Big Objects

What Are Big Objects?

Big objects let you archive and manage large amounts of data within Salesforce without affecting performance, scaling to billions of records. Organizations can use standard or custom big objects to address their large-data issues.

The advantage of storing archived data in big objects is that it stays within Salesforce, remains queryable, and is easy to retrieve on demand.

How to Use Custom Big Objects?

Custom big objects are defined and deployed using the Metadata API. They allow you to store data and information unique to your organization.

  • Define and create the big object using the Metadata API or from Setup.
  • Create the object file containing the definition, fields, and index, along with a permission set and a package file.
  • Select ‘Active’ to activate the particular record type.
  • Change the deployment status to ‘Deployed’.
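As an illustration of the steps above, a minimal object file for a big object might look like the sketch below. The object name `Customer_Interaction__b`, its fields, and the index are assumptions made for this example, not part of the original steps:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Customer_Interaction__b.object : hypothetical big object for archived data -->
<CustomObject xmlns="http://soap.sforce.com/2006/04/metadata">
    <deploymentStatus>Deployed</deploymentStatus>
    <fields>
        <fullName>Account_Id__c</fullName>
        <label>Account Id</label>
        <length>18</length>
        <type>Text</type>
        <required>true</required>
    </fields>
    <fields>
        <fullName>Interaction_Date__c</fullName>
        <label>Interaction Date</label>
        <type>DateTime</type>
        <required>true</required>
    </fields>
    <!-- The index defines how rows are identified and how they can be queried -->
    <indexes>
        <fullName>InteractionIndex</fullName>
        <label>Interaction Index</label>
        <fields>
            <name>Account_Id__c</name>
            <sortDirection>DESC</sortDirection>
        </fields>
        <fields>
            <name>Interaction_Date__c</name>
            <sortDirection>DESC</sortDirection>
        </fields>
    </indexes>
    <label>Customer Interaction</label>
    <pluralLabel>Customer Interactions</pluralLabel>
</CustomObject>
```

The object file would be deployed together with a permission set and a `package.xml` manifest listing the object as a `CustomObject` member, as the steps above describe.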

Key Considerations to Take into Account

  • Big objects support only object and field permissions.
  • You can’t edit or delete the index.
  • Big objects can be accessed from custom Salesforce Lightning and Visualforce components rather than standard UI elements (home pages, detail pages, list views, and so on).
  • You can create up to 100 big objects per org. The limits for big object fields are similar to the limits on custom objects and depend on your organization’s license type.
  • To support the scale of data in a big object, you can’t use triggers, flows, processes, or the Salesforce mobile app with them.
  • You can query data in big objects with Async SOQL or standard SOQL, depending on the volume of data to query and the need for real-time results.
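To illustrate that last point, a standard SOQL query against a hypothetical `Customer_Interaction__b` big object could look like the Apex sketch below. Object and field names are assumptions for this example; note that big object queries must filter on index fields in the order they are defined:

```apex
// Hypothetical sketch: query a custom big object with standard SOQL.
// Filters must use the index fields in index order with = or IN;
// a range condition is allowed only on the last filtered index field.
List<Customer_Interaction__b> archived = [
    SELECT Account_Id__c, Interaction_Date__c
    FROM Customer_Interaction__b
    WHERE Account_Id__c = '001xx000003DGb2AAG'
];
```

For very large result sets, a similar filter could instead be submitted as an Async SOQL job through the REST API, which writes its results to a target object rather than returning them synchronously.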

Ready to save up to 85% of your storage costs and secure your Salesforce data?

Salesforce provides plenty of features for storing and archiving data, but the best approach depends on your business needs. If you need expert advice or someone to do this for you, reach out to Lister Technologies today.
