
On the web, I would split all frontends into two big classes: SPAs and SSR websites. SPA stands for Single Page Application. To put it simply: it is an index.html file that loads one large .js file. This .js file holds everything: all HTML pages of the website, styles, images, and so on. So when the user clicks a link inside an SPA, the browser does not perform an actual HTTP request, because the page's HTML is already loaded into memory from the .js file. The URL bar of the browser is changed using JavaScript. To pass data between the user and the backend, SPAs use compact REST API calls.
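To make that concrete, here is a minimal sketch of client-side navigation in plain browser TypeScript, with no framework. The page templates and the /api/profile endpoint are made up for the example:

```ts
// Hypothetical in-memory "pages" of the SPA: showing them needs no HTTP request.
const pages: Record<string, string> = {
  "/": "<h1>Home</h1>",
  "/profile": '<h1>Profile</h1><div id="data"></div>',
};

document.addEventListener("click", async (e) => {
  const link = (e.target as HTMLElement).closest("a");
  if (!link) return;
  e.preventDefault();                       // stop the real HTTP navigation
  const path = new URL(link.href).pathname;
  history.pushState({}, "", path);          // change the URL bar with JavaScript
  document.body.innerHTML = pages[path] ?? "<h1>Not found</h1>";

  // Data still travels over compact REST calls, not full HTML pages.
  if (path === "/profile") {
    const me = await fetch("/api/profile").then((r) => r.json());
    document.getElementById("data")!.textContent = me.name;
  }
});
```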

SSR stands for server-side rendering. Here, when the user clicks a link, the browser makes a real HTTP request that loads the HTML of the page. So much more traffic is consumed on both the server and the user ends, plus the server always has to render HTML, wasting precious CPU time, whereas an SPA uses the CPU time of the user's device.
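For contrast, a server-rendered page looks roughly like this. This is only a sketch using Express; the post data and the HTML template are invented for the example:

```ts
import express from "express";

const app = express();

// Hypothetical data source; in a real app this would be a DB query.
const posts: Record<string, { title: string; body: string }> = {
  "1": { title: "Hello", body: "Rendered on the server" },
};

// Every click on a link becomes a real request, and the server burns CPU
// building the full HTML string for each of them.
app.get("/post/:id", (req, res) => {
  const post = posts[req.params.id];
  if (!post) return res.status(404).send("Not found");
  res.send(`<!doctype html>
<html>
  <head><title>${post.title}</title></head>
  <body><h1>${post.title}</h1><p>${post.body}</p></body>
</html>`);
});

app.listen(3000);
```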

A lot of people tend to think that SSR apps are legacy and come from the pre-SPA world. It is true: in the 2000s, PHP "spawned" tons of CMSes like WordPress, Joomla, various BB forums, and so on. But SSR apps are still needed nowadays. The thing is: search engines still don't work well with SPAs. Yes, Google can already execute JavaScript, but other search engines won't do it. Also, social networks will not read OpenGraph tags from your SPA unless you apply some special clunky workarounds. So commercial SEO is a waste of money to this day if the site is a pure SPA.

Big web portals commonly build hybrid apps: logged-out users (and search engine crawlers) get the SSR version, so their blogs and landing pages work great with SEO and SMM strategies, and once the user logs in they get the SPA. Often the SPA is served on different URLs or even subdomains. However, sometimes it is served on the same URLs as the SSR version, so the code of some middleware looks like this:

```
if the request has a header/cookie indicating the user is logged in
then
    serve index.html of the SPA bundle
else
    pass the request to a view that renders HTML for this URL
```
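
A more concrete version of that middleware could look like this. It is an Express sketch: the sessionid cookie name and the location of the SPA bundle are assumptions, and a real portal would usually validate a session or JWT instead of just checking that a cookie exists:

```ts
import express from "express";
import path from "path";

const app = express();

// Assumption: a "sessionid" cookie means the user is logged in.
app.use((req, res, next) => {
  const loggedIn = (req.headers.cookie ?? "").includes("sessionid=");
  if (loggedIn) {
    // Logged-in users get the SPA shell; routing continues in the browser.
    res.sendFile(path.join(__dirname, "spa", "index.html"));
  } else {
    // Crawlers and logged-out users fall through to the SSR views.
    next();
  }
});

// ...SSR routes are registered after this middleware...

app.listen(3000);
```

The ordering matters: the SPA check runs before the SSR routes, so crawlers and logged-out visitors simply fall through to the regular server-rendered views.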

This moves web portals closer to SMM strategies: if a user posts the URL of an SPA page to chats or social networks, the underlying crawler will fetch the URL and parse the HTML and OpenGraph tags correctly. However, if a real user goes to the same URL and logs in there, they will receive a better, seamless UX.