I recently worked with a client here in Norway that wanted to expand their business to other countries. The business was partially focused on e-commerce. I won't go into any detail about the customer here, but I can describe the process of making sure their website and their solutions could be translated from Norwegian into other languages.
The solution had to translate the texts hosted in the frontend application (which was already using i18next), all the texts served by the backend, which lived in different microservices with their respective SQL databases, and all the statically written outgoing correspondence: SMSes, emails, and later on, notifications. It had to be user-friendly and support an unlimited number of languages. The end result would be a service called by other services, or by the frontend, to return data in the requested language.
I had completely free choice in how the frontend would be designed, and I could use whichever framework I saw fit. This was a good opportunity for me to test out Blazor Server, as I had tried it on my own projects before and it seemed like it would save me a lot of time. As for the backend, the only requirement was to use .NET.
The microservice would need to store a copy of the default data in its SQL database. We used React on the frontend, and i18next generated a file with all the combined static texts. Those texts were sent to the translation service whenever the application was deployed.
This is the process of initializing and copying the data coming from the frontend.
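As a rough sketch of that step, assuming EF Core with a hypothetical entity and context (none of these names come from the actual codebase):

```csharp
using Microsoft.EntityFrameworkCore;

// Hypothetical entity for the default-language table.
public class DefaultTranslation
{
    public string Key { get; set; } = "";
    public string Value { get; set; } = "";
}

public class TranslationDbContext : DbContext
{
    public TranslationDbContext(DbContextOptions<TranslationDbContext> options) : base(options) { }
    public DbSet<DefaultTranslation> DefaultTranslations => Set<DefaultTranslation>();
}

public class TranslationSeeder
{
    private readonly TranslationDbContext _db;
    public TranslationSeeder(TranslationDbContext db) => _db = db;

    // Called on frontend deployment with the combined i18next key/value file.
    public async Task SeedDefaultsAsync(IReadOnlyDictionary<string, string> entries)
    {
        foreach (var (key, value) in entries)
        {
            // Only copy keys we haven't seen before; updates are handled later.
            if (!await _db.DefaultTranslations.AnyAsync(t => t.Key == key))
                _db.DefaultTranslations.Add(new DefaultTranslation { Key = key, Value = value });
        }
        await _db.SaveChangesAsync();
    }
}
```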
The second part is pretty simple: we need a button that creates a new language table and copies over all keys with empty values. Since we have Blazor Server on the frontend, we don't need to create an API controller; we can call a function directly.
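In Blazor Server that button can simply invoke an injected service; a minimal sketch, where the `ILanguageAdmin` service and its method are hypothetical stand-ins:

```razor
@* Hypothetical admin component: no API controller, just a direct service call. *@
@inject ILanguageAdmin Admin

<input @bind="_languageCode" placeholder="Language code, e.g. sv" />
<button @onclick="CreateLanguageAsync">Add language</button>

@code {
    private string _languageCode = "";

    private async Task CreateLanguageAsync()
    {
        // Creates the new language table and copies all default keys with empty values.
        await Admin.CreateLanguageAsync(_languageCode);
    }
}
```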
As support for new languages is set up, we need to ensure that all new keys get added to the database and all modified texts get updated. This requires quite a bit of logic, since our POST method becomes an upsert method. Statuses are assigned to individual key entries: no changes, modified, and new. This takes some time to process, since we have to iterate through all the default values, check whether they've been modified, and do the same for all other languages. It only happens when the React application is deployed, so there's no need for optimization just yet.
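The status check itself boils down to comparing each incoming key against the stored defaults; a sketch of that classification, with assumed names:

```csharp
public enum KeyStatus { NoChanges, Modified, New }

public static class UpsertLogic
{
    // Compare an incoming key/value pair from the deployed i18next file
    // against the stored default texts.
    public static KeyStatus Classify(
        IReadOnlyDictionary<string, string> storedDefaults, string key, string incomingValue)
    {
        if (!storedDefaults.TryGetValue(key, out var existing))
            return KeyStatus.New;        // add to defaults, add empty rows to every language
        return existing == incomingValue
            ? KeyStatus.NoChanges        // nothing to do
            : KeyStatus.Modified;        // update the default, flag existing translations
    }
}
```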
I won't go into detail on how I built the frontend, but the structure of the web pages follows the structure of the database: you choose a service and a language, and you get a table of key-value pairs showing the default translation and the language of your choice.
Since one of the requirements was to never return an empty value, we decided to return the default value wherever a translation in the requested language does not exist. We determine which language to return from a country-code token from the user: if the user is logged into a Norwegian account, we return the defaults, but if the user is logged in from another country, we return that country's values where they exist.
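That fallback rule can be sketched as a small lookup; the table shapes and the "NO" default are assumptions for illustration:

```csharp
public static class LanguageResolver
{
    // Resolve the value for a key: use the requested country's translation when it
    // exists and is non-empty, otherwise fall back to the Norwegian default.
    public static string Resolve(
        IReadOnlyDictionary<string, string> defaults,
        IReadOnlyDictionary<string, IReadOnlyDictionary<string, string>> translations,
        string countryCode, string key)
    {
        if (countryCode != "NO"
            && translations.TryGetValue(countryCode, out var language)
            && language.TryGetValue(key, out var value)
            && !string.IsNullOrEmpty(value))
            return value;

        return defaults[key]; // never return an empty value
    }
}
```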
Since the data is mostly static, we don't always have to query the database; we can implement response caching instead. I used Redis caching in my solution, and it decreased the website's load time, since the data was served from Redis and refreshed only every 24 hours.
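A minimal sketch of that cache-aside read over `IDistributedCache` (registered with `AddStackExchangeRedisCache`); the database delegate here is a stand-in for the real SQL query:

```csharp
using Microsoft.Extensions.Caching.Distributed;

public class CachedTranslationReader
{
    private readonly IDistributedCache _cache;
    private readonly Func<string, Task<string>> _loadFromDatabase;

    public CachedTranslationReader(IDistributedCache cache, Func<string, Task<string>> loadFromDatabase)
        => (_cache, _loadFromDatabase) = (cache, loadFromDatabase);

    public async Task<string> GetAsync(string cacheKey)
    {
        var cached = await _cache.GetStringAsync(cacheKey);
        if (cached is not null)
            return cached; // served straight from Redis

        var fresh = await _loadFromDatabase(cacheKey);
        await _cache.SetStringAsync(cacheKey, fresh, new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromHours(24) // refresh once a day
        });
        return fresh;
    }
}
```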
This one was a little trickier than the front-end translation. First of all, I needed a non-invasive way of calling the translation service.
The solution was to create a method inside each microservice that wraps texts in a function:
t(key, value)
where the key is a static, unique key and the value is the text that needs to be translated. Here I also needed Redis caching, to reduce the traffic going to the translation service.
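A sketch of what that helper could look like; the client interface, cache-key format, and expiry are assumptions:

```csharp
using Microsoft.Extensions.Caching.Distributed;

// Hypothetical client for the translation service.
public interface ITranslationClient
{
    // Returns the translated text, or the passed-in default when none exists.
    Task<string> TranslateAsync(string key, string defaultValue, string countryCode);
}

public class Translator
{
    private readonly IDistributedCache _cache;
    private readonly ITranslationClient _client;

    public Translator(IDistributedCache cache, ITranslationClient client)
        => (_cache, _client) = (cache, client);

    // The t(key, value) wrapper each microservice calls.
    public async Task<string> T(string key, string value, string countryCode)
    {
        var cacheKey = $"t:{countryCode}:{key}";
        var cached = await _cache.GetStringAsync(cacheKey);
        if (cached is not null)
            return cached; // no round trip to the translation service

        var translated = await _client.TranslateAsync(key, value, countryCode);
        await _cache.SetStringAsync(cacheKey, translated, new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromHours(24)
        });
        return translated;
    }
}
```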
Although the problem itself seems pretty simple, a lot of complexity can go into creating a solution that has to fit into an existing system, and many times I questioned my own choices. This solution proved to be fast and not too invasive, and overall I'm pretty happy with how it turned out.