A technical architect plays an important role in designing solutions, establishing best practices, setting the technical vision and guidelines, and selecting or suggesting tools that add value to the platform. In this two-part series I am putting together key points an architect should consider while designing solutions on the Salesforce platform. Here are a few important points in my view.
Salesforce version control and release management
Version control and release management are essential for project success and for enabling continuous delivery. In an agile environment where multiple development teams work on different projects, in parallel with ongoing production bug fixes, it is challenging to avoid conflicts between projects in the absence of the right version control system.
Salesforce doesn’t offer an out-of-the-box version control solution, so it is important for an architect to establish the right strategy at an early stage using a third-party or on-site source control repository. Salesforce exposes most of its components as metadata, which can be stored in version control systems such as SVN or GitHub. Though there are a few unsupported metadata types that can’t be version controlled, the most frequently used metadata types are supported.
While version control tools give you the ability to manage software versioning through check-outs, commits, merges and check-ins, they can’t help you with the software build, promotion and release process. It’s imperative to have a simple and reliable release process for getting your code onto your various environments. Again, Salesforce doesn’t provide release automation support, so an architect needs to establish the right release-management guidelines using third-party solutions or tools like Jenkins. Most organizations have already implemented version control and release management for other programming languages, and that infrastructure can be reused for Salesforce. A release management tool like Jenkins has connectors to Salesforce to deploy a release package to various environments; if you already have a Jenkins implementation, you only need to add the connector.
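As a rough sketch, a Jenkins pipeline for a Salesforce deployment could look like the following. This assumes the Salesforce CLI (sfdx) is installed on the build agent and that DeploySandbox is a hypothetical, pre-authenticated org alias; treat it as an illustration rather than a ready-made pipeline:

```groovy
// Hypothetical Jenkinsfile sketch: deploy metadata pulled from version
// control to a target org using the Salesforce CLI.
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                checkout scm   // pull the metadata from SVN/GitHub
            }
        }
        stage('Deploy to Sandbox') {
            steps {
                // 'DeploySandbox' is an assumed org alias; -w waits
                // up to 30 minutes for the deployment to finish
                sh 'sfdx force:mdapi:deploy -d src -u DeploySandbox -w 30'
            }
        }
    }
}
```

The same stage can be repeated per environment (QA, UAT, production) with a different org alias, which is exactly the promotion path the release process needs to codify.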
Salesforce data backup and restore
Data backup is a key activity when implementing any enterprise system, and Salesforce is no exception. You can either use third-party solutions or perform an on-premise data backup. A daily incremental backup is essential to restore data and prevent losses in unexpected situations. A backup might not bring every record back to its original state, as many records may have been updated after the last backup, which may require manual updates. Salesforce does replicate your data across data centres, but any restoration performed by Salesforce is a paid service. Data backup is also essential if an enterprise data warehouse uses the data for analytics.
A backup can be full, incremental or partial, and can be performed using the various APIs or any ETL tool.
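To make the incremental case concrete: an incremental backup typically selects only records modified since the last run, using the SystemModstamp field (which tracks the last system or user modification). A minimal Python sketch of building such a SOQL query follows; the object and field names are just examples, and the resulting string would be handed to the SOAP/Bulk API or an ETL job:

```python
def build_incremental_query(sobject, fields, since_iso):
    """Build a SOQL query selecting records modified after the last backup.

    `since_iso` is a UTC timestamp literal such as '2020-01-01T00:00:00Z'.
    """
    field_list = ", ".join(fields)
    return (f"SELECT {field_list} FROM {sobject} "
            f"WHERE SystemModstamp > {since_iso}")

# Example: fetch Accounts changed since the last nightly run
query = build_incremental_query("Account", ["Id", "Name"],
                                "2020-01-01T00:00:00Z")
print(query)
```

Persisting the timestamp of each successful run gives you the `since_iso` value for the next one, which is the core of an incremental backup loop.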
Apex design patterns and coding standards

Apex is quite similar to the Java programming language in syntax and can follow the patterns applicable to object-oriented programming languages. By using design patterns you enforce better organization, reusability and scalability. It is important for an architect to establish the right code development practices so that developers are not each following their own practices or naming conventions. While developing on the Force.com platform, one needs to be aware of Salesforce platform best practices in accordance with the governor limits. For example, all DML must be bulkified to avoid DML limits, and SOQL should never be placed inside a for loop.
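To illustrate bulkification, here is a small Apex sketch (object and field names are illustrative, and `contacts` is assumed to be the list of records being processed): the SOQL query stays outside the loop, results are collected in a map, and a single DML statement is issued at the end.

```apex
// Illustrative sketch: bulkified processing of a list of Contacts.
// One SOQL query and one DML statement, regardless of list size.
Set<Id> accountIds = new Set<Id>();
for (Contact c : contacts) {
    accountIds.add(c.AccountId);
}
// Single query outside the loop instead of one query per record
Map<Id, Account> accounts = new Map<Id, Account>(
    [SELECT Id, Name FROM Account WHERE Id IN :accountIds]);
List<Contact> toUpdate = new List<Contact>();
for (Contact c : contacts) {
    if (accounts.containsKey(c.AccountId)) {
        c.Description = accounts.get(c.AccountId).Name;
        toUpdate.add(c);
    }
}
update toUpdate;  // single bulk DML statement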
Salesforce governor limits

Salesforce is a system of limits, so before designing any solution an architect has to be aware of them. Salesforce has different types of governor limits applicable to callouts, Apex batches, per-transaction operations, emails and more. Each feature also has its own limits based on your Salesforce edition, so it is better to review the feature limits as applicable.
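Apex exposes the Limits class for checking consumption against these limits at runtime, which is useful for defensive coding in bulk contexts. A small sketch:

```apex
// Sketch: check governor limit consumption before issuing another query
System.debug('SOQL queries used: ' + Limits.getQueries()
             + ' of ' + Limits.getLimitQueries());
if (Limits.getQueries() < Limits.getLimitQueries()) {
    List<Account> accts = [SELECT Id FROM Account LIMIT 10];
}
```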
Data import and export

There are various options to import and export data to and from Salesforce. Salesforce provides the Data Loader to perform mass data import/export activities, which addresses some shortcomings of the built-in wizards. The Data Loader can be used if your needs are limited to performing DML operations or exporting data. If you need a full-scale solution to transform data coming from or going to different data sources, then an ETL tool with a Salesforce connector should be used. For example, suppose you need to load data into Salesforce from an Oracle database, where data from multiple tables must first be transformed and then loaded into the Account object. This kind of complex transformation can only be performed with an ETL tool. On top of that, any ETL tool gives you benefits like failover, job monitoring, notifications and scheduling out of the box.
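To make the transformation step concrete, here is a minimal Python sketch, independent of any specific ETL product, that joins rows from two hypothetical Oracle tables (the table and column names are invented for illustration) and shapes them into Account-like records ready for loading:

```python
def transform_to_accounts(companies, addresses):
    """Join company and address rows (sharing a COMPANY_ID key)
    into flat Account-shaped records for loading into Salesforce."""
    addr_by_id = {a["COMPANY_ID"]: a for a in addresses}
    records = []
    for comp in companies:
        addr = addr_by_id.get(comp["COMPANY_ID"], {})
        records.append({
            "Name": comp["COMPANY_NAME"],
            "BillingCity": addr.get("CITY"),
            "BillingCountry": addr.get("COUNTRY"),
        })
    return records

# Example rows as they might come from two Oracle tables
accounts = transform_to_accounts(
    [{"COMPANY_ID": 1, "COMPANY_NAME": "Acme"}],
    [{"COMPANY_ID": 1, "CITY": "Pune", "COUNTRY": "India"}])
print(accounts)
```

A real ETL tool performs the same kind of join-and-reshape step, but adds the connector, scheduling and monitoring pieces around it.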
Based on the data volume and the requirements for data import/export, an architect should select the appropriate tool, but I would highly recommend using a full-fledged ETL tool instead of the Data Loader.
Salesforce Transaction Security Policy
There are times when the Setup Audit Trail is not enough to capture critical information related to data access, or to apply certain actions or notifications. As an architect you are often asked to build solutions for audit or compliance requirements where a system administrator needs to be notified, or an automated action performed, when a Salesforce user or profile accesses certain sensitive data. In another case, let’s say you are asked to restrict users to only 3 parallel sessions instead of the limit of 5 provided by Salesforce. Developing these scenarios from scratch poses many challenges and still won’t meet all needs. With a transaction security policy you can monitor events such as logins, data exports, or an entity or certain resources being accessed, and then define what notifications to send and which real-time actions to perform.
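Under the hood, a transaction security policy is backed by an Apex class implementing the TxnSecurity.PolicyCondition interface. A hedged sketch for the concurrent-sessions scenario above might look like this (the class name is invented, and the threshold of 3 follows the example):

```apex
// Sketch: evaluate to true (triggering the policy's action, e.g. blocking
// the login) when the user already has 3 or more active sessions.
global class LimitConcurrentSessionsPolicy implements TxnSecurity.PolicyCondition {
    public boolean evaluate(TxnSecurity.Event e) {
        Integer sessions = [SELECT COUNT() FROM AuthSession
                            WHERE UsersId = :e.userId];
        return sessions >= 3;
    }
}
```

The action taken when the condition returns true (block, require two-factor authentication, notify) is configured declaratively on the policy itself.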
This feature is not enabled by default in any edition and requires an extra licensing cost, but I think it is worth evaluating.
Salesforce Named Credentials

While working with integrations, if there is a need to connect to a service via password authentication or OAuth, you must know this feature. It gives you the option to store the credentials for a callout URL using password authentication, anonymous access or OAuth. The other important aspect is that you don’t need to create a remote site setting when the integration endpoint is declared in Named Credentials. Though this feature is just configuration, it is important to know, as these settings are sometimes kept in custom settings when they could be handled via Named Credentials.
Normally you connect to a service URL and supply authentication information from your Apex code, but if you are using Named Credentials you reference the name of the named credential instead. You can also configure the root URL while creating a new credential and then append the rest of the path in Apex. In the example below, My_Example is the name of the named credential, and the path that follows it gets appended to the configured root URL.
HttpRequest req = new HttpRequest();
// 'callout:' references the named credential; '/some/path' is an example path
req.setEndpoint('callout:My_Example/some/path');
Appending to the URL is useful when you want to invoke different URLs based on certain conditions in code while keeping the same authentication settings.
That’s it for now in this post. I will cover the remaining points in the next post. Let me know your feedback on what else you consider important while designing solutions on Salesforce.