What To Do If You Run Out of Data Storage with Salesforce and How To Prevent It

By Nitin Dangwal / August 14, 2016 · Updated May 27, 2020

Salesforce - Data Archival Needs and Strategies

As a cloud-based, multi-tenant application, Salesforce must closely monitor its resource usage. Whether it is server processing time, transactions, or data storage, every resource is metered to ensure fair usage across tenants. One of the most important limits is the data storage limit.

What happens if the data storage limit runs out?

Salesforce doesn't immediately stop users from inserting new data when data storage reaches 100%. Users can continue working and creating or updating records up to a grace limit (usually around 110%, though this may vary). Once the grace limit is breached, Salesforce blocks create and update operations and throws errors if users attempt them.
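The safest way to avoid hitting the grace limit is to watch usage before it becomes a problem. A minimal sketch: the helper below computes usage from a dict shaped like the response of Salesforce's REST Limits endpoint (`/services/data/vXX.0/limits`), where `DataStorageMB` reports `Max` (allocated MB) and `Remaining` (unused MB); the `warn_at` threshold is an illustrative choice, not a Salesforce setting.

```python
def storage_status(limits: dict, warn_at: float = 80.0) -> tuple[float, str]:
    """Compute data storage usage from a Limits-API-shaped response.

    `limits["DataStorageMB"]` is expected to carry "Max" (allocated MB)
    and "Remaining" (unused MB), mirroring the REST Limits endpoint.
    """
    data = limits["DataStorageMB"]
    used = data["Max"] - data["Remaining"]
    pct = 100.0 * used / data["Max"]
    if pct >= 100.0:
        status = "over-limit"   # already in the grace zone; writes may soon fail
    elif pct >= warn_at:
        status = "warning"      # time to run the archival job
    else:
        status = "ok"
    return round(pct, 1), status


# Example: a 10 GB org with 1.5 GB remaining is at 85% usage
print(storage_status({"DataStorageMB": {"Max": 10240, "Remaining": 1536}}))
# -> (85.0, 'warning')
```

Wiring this check into a scheduled job (or any monitoring tool) turns a surprise outage into a routine alert.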

Benefits of Data Archival

Having established that data archival is essential in Salesforce, let's look at the benefits of implementing it:

  1. Optimum data storage usage- makes the best use of the storage Salesforce provides
  2. Reduced cost- storage can be reused for current, relevant data while less relevant data is kept in cheaper backup locations
  3. Consistent application performance- as data grows, the application can become slow and unresponsive; even standard functionality such as list views and reports can start to degrade. With an appropriate archival process, only relevant data is persisted in Salesforce, keeping list views, search, queries, and reports performant
  4. Compliance- many industries (e.g. insurance, finance) are governed by compliance rules and must adhere to defined data retention guidelines and policies
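A retention policy like the ones compliance rules mandate usually reduces to a simple cutoff-date test. The sketch below assumes each record is a dict carrying a `CreatedDate` as a `datetime.date` (the field name is illustrative, echoing Salesforce's audit field) and splits a batch into rows to keep and rows to archive:

```python
from datetime import date, timedelta


def partition_by_retention(records, retention_days, today=None):
    """Split records into (keep, archive) around a retention cutoff.

    Records created on or after `today - retention_days` stay in the
    main object; older ones are candidates for the archive store.
    """
    today = today or date.today()
    cutoff = today - timedelta(days=retention_days)
    keep = [r for r in records if r["CreatedDate"] >= cutoff]
    archive = [r for r in records if r["CreatedDate"] < cutoff]
    return keep, archive


# Example: with a one-year retention window, a 2014 record is archived
recs = [
    {"Id": "a", "CreatedDate": date(2016, 1, 1)},
    {"Id": "b", "CreatedDate": date(2014, 6, 30)},
]
keep, archive = partition_by_retention(recs, 365, today=date(2016, 8, 14))
```

The same predicate can be expressed as a SOQL `WHERE CreatedDate < :cutoff` filter when selecting archival candidates directly in the org.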

Planning for disaster is planning to avoid it

It is evident that an appropriate data archival strategy needs to be designed and incorporated. Salesforce doesn't prescribe any specific archival mechanism, but it does recommend designing an appropriate data purge mechanism.

Backup options

  1. Big Objects (Pilot): Salesforce has recently launched a new data storage mechanism called "Big Objects". It is a bulk data storage option within the database layer of Salesforce. According to community forums, it is backed by an HBase storage platform, which means it should be able to handle very large volumes of data (likely more than 100M records) and can be used for bulk data processing. At this stage, however, details around the feature are limited
  2. Cloud based archival platforms: with ever-increasing demand for backup and archival, major cloud players offer data archival and backup platforms, e.g. Amazon Glacier and Google Cloud Storage (Nearline)
  3. Cloud based archival services: several vendors offer cloud-based backup and archival services, e.g. Backupify, Spanning, and OwnBackup. Each product has a myriad of features, ranging from data comparison, snapshots, and automatic backups to user-based backups and one-click restore
  4. External on-prem archival: organizations may choose to use an in-house tool to manage backups and data archival (e.g. a database within the local network)
  5. Scheduled backups: included here purely for informational purposes. Salesforce gives administrators the ability to export a backup of their Salesforce data (with a choice of specific objects). This can come in handy for ad-hoc backup needs without any third-party tool. Once the export completes, Salesforce stages the backup files and emails a download link to the administrator
  6. Manual data backup via Salesforce: administrators can manually export data from Salesforce using the Data Loader or other third-party tools
  7. Archival custom object (least preferred): create a custom object to store archived data. The only benefit of this approach is slightly better performance, since it keeps the row count of the main object in check; it does not reduce overall storage usage, because archived rows still count against org storage
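Whichever target you pick from the list above, the core flow is the same: durably write the copy first, and only then delete from the main object. A minimal local sketch, using an append-only JSON-lines file as a stand-in for the real archive store (Big Objects, Glacier, an on-prem database, ...); the function names and record shape are illustrative:

```python
import json
import os


def archive_then_delete_ids(records, archive_path):
    """Append records to a JSON-lines archive file, then return the ids
    that are now safe to delete from the main object.

    The fsync ensures the archive copy is on disk *before* any delete
    is issued, so a crash mid-run never loses data.
    """
    with open(archive_path, "a", encoding="utf-8") as fh:
        for rec in records:
            fh.write(json.dumps(rec) + "\n")
        fh.flush()
        os.fsync(fh.fileno())
    return [rec["Id"] for rec in records]
```

In a real job, the returned ids would feed a (batched) delete call against the org; keeping the write and the delete as separate, ordered steps is what makes the process safe to re-run after a failure.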

Conclusion

Irrespective of which approach you finalize for your organization, identifying one is very important. Not planning for data archival in a cloud-based SaaS application is like waiting for disaster to happen. The biggest problem with data archival is that it is usually underestimated: once the data size becomes huge, it is quite difficult to design and implement an appropriate archival strategy, and even harder to test it.

