
Inexpensive, Non-realtime Sites using Queues


Many sites for small and medium businesses rely on dynamic engines (such as ASP.NET or PHP) to generate most of their content at run-time. This typical setup requires hosting that is expensive relative to a small or medium-sized company's budget and, if not designed properly, may not perform very well. I would like to argue that most of these sites could instead be served as static HTML pages generated offline at intervals (either fixed or on-demand, discussed later), with a JavaScript front end that communicates with a slim server-side service whose only job is to store commands from visitors into a queue. The queue can then be read and processed by computers outside the hosting environment, freeing the business from paying extra hosting fees for an application server.


The costs are reduced by the following:
  1. Static HTML pages require less computing power to serve.
  2. Queue storage is usually a cheaper service than a full SQL database.
  3. A slim service that just relays commands between the visitor and the queue requires far less power than a full-blown web application (a minimal sketch of such a service follows this list).
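
For illustration, here is a minimal sketch of such a relay service in TypeScript, assuming Express for HTTP handling; the enqueue function is a hypothetical adapter whose real implementation depends on the cloud queue you pick (GAE's task queue, Azure Queue Storage, etc):

    import express from "express";

    const app = express();
    app.use(express.json());

    // Hypothetical adapter around the provider's queue; the actual call
    // depends on the cloud service you choose.
    async function enqueue(command: unknown): Promise<void> {
      // e.g. push the command as a JSON message onto the provider's queue
    }

    // The only dynamic endpoint: accept a visitor command and store it.
    app.post("/commands", async (req, res) => {
      await enqueue(req.body);                  // relay into the queue
      res.status(202).json({ accepted: true }); // accepted, not yet processed
    });

    app.listen(8080);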

Static HTML sites do not require a high-end server to achieve great performance. In fact, static HTML sites can take advantage of CDNs and client-side caching. Many cloud services, such as Google App Engine, provide a generous free quota for static content. To generate the static site, one could use any of the many static site generators out there; one example is Jekyll.
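
As a small illustration of deploy-time generation (independent of any particular generator), a build script can render plain HTML files from a data file before the site is uploaded; the products.json file and the page layout here are made up for the example:

    // Deploy-time build script: render one static page per product.
    import { readFileSync, writeFileSync, mkdirSync } from "fs";

    interface Product { slug: string; name: string; price: number; }

    const products: Product[] = JSON.parse(readFileSync("products.json", "utf8"));
    mkdirSync("site/products", { recursive: true });

    for (const p of products) {
      const html = `<html><body>
        <h1>${p.name}</h1><p>$${p.price}</p>
        <button onclick="buy('${p.slug}')">Buy</button>
      </body></html>`;
      writeFileSync(`site/products/${p.slug}.html`, html);
    }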

A lot of the user interaction on a small or medium business's (SMB) public site does not require the ACID properties of a SQL database. A simple queue can reliably store a visitor's request for future processing. For example, a customer's request to buy a product only needs to be stored in the queue; the off-cloud Application Processor later retrieves it from the cloud's queue and does whatever it needs to do to process the sale. Cloud SQL databases can be expensive (e.g. SQL Server on Azure), but queues such as GAE's task queue are inexpensive and have free-usage quotas.
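
On the front end, the purchase then reduces to posting a small command object to the slim service. This sketch assumes the /commands endpoint from the earlier service sketch; the buy function is the one the static pages above would call:

    // Post a purchase command to the slim relay service.
    async function buy(productSlug: string): Promise<void> {
      const command = {
        id: crypto.randomUUID(), // unique id lets the processor deduplicate
        type: "purchase",
        product: productSlug,
        requestedAt: new Date().toISOString(),
      };
      const res = await fetch("/commands", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(command),
      });
      if (res.status === 202) {
        alert("Thanks! Your order was received and will be processed shortly.");
      }
    }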

By running as little code as possible in the cloud, the cost of processing minutes can be saved. In the customer-purchase example, the code that processes the sale does not need to live in an expensive cloud; it could live on a regular on-premises computer that checks the queue every so often. The computer that does the heavy processing can also be placed in another, cheaper hosted environment if the business requires it. The point is to keep your public-facing site (the front line of your business) as fast as possible, while your back-end services (which do not require real-time responses) run in inexpensive environments (on-premises or at a hosting provider).
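
A minimal sketch of such an off-cloud Application Processor follows; dequeue is again a hypothetical adapter over the provider's queue API, and the 30-second idle interval is arbitrary:

    interface PurchaseCommand {
      id: string; type: string; product: string; requestedAt: string;
    }

    // Hypothetical adapter: pop and acknowledge the next queue message,
    // or return null when the queue is empty.
    async function dequeue(): Promise<PurchaseCommand | null> {
      return null;
    }

    async function processSale(command: PurchaseCommand): Promise<void> {
      // charge the card, record the sale, send the confirmation email, etc.
    }

    // Poll the cloud queue from an ordinary on-premises machine.
    async function pollForever(): Promise<void> {
      while (true) {
        const command = await dequeue();
        if (command) {
          await processSale(command);
        } else {
          await new Promise((r) => setTimeout(r, 30_000)); // idle for 30s
        }
      }
    }

    pollForever();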

Of course, the application must be designed so that the Application Processors are reliable and can scale horizontally (i.e. more Application Processors can be added without compromising reliability). The back-end database the Application Processors connect to can still be a SQL ACID database, but it no longer needs to be in the cloud. In the customer-purchase example, the sale and its details would go into the normal SQL database after the command is processed.
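
One common way to make the processors safe to run in parallel is to make processing idempotent, for example by keying the sale record on the command's unique id. The sketch below assumes a PostgreSQL-style ON CONFLICT clause and an illustrative db.query helper, not any specific library's API:

    interface Db {
      query(sql: string, params: unknown[]): Promise<{ rowCount: number }>;
    }

    // Recording the sale keyed on the command id means a retried or
    // duplicated queue message becomes a harmless no-op.
    async function recordSale(db: Db, commandId: string, product: string): Promise<void> {
      const result = await db.query(
        "INSERT INTO sales (command_id, product) VALUES ($1, $2) " +
        "ON CONFLICT (command_id) DO NOTHING",
        [commandId, product]
      );
      if (result.rowCount === 0) {
        // Another processor (or an earlier retry) already handled this one.
      }
    }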

In the next posts, I will be linking to examples of sites that take advantage of content generated at deployment time. I will also link to examples of slim services that take advantage of cheap or free code hosting such as Google App Engine and Azure Websites.
