Author: Jake Bauer <email@example.com>
Date: Sun, 25 Aug 2019 00:13:25 -0400
Update about-site to reflect new CSS capabilities
1 file changed, 12 insertions(+), 12 deletions(-)
diff --git a/pages/about-site.md b/pages/about-site.md
@@ -56,10 +56,10 @@ server to shut down to prevent data loss.
The pages themselves are written in the Markdown markup language with bits of
HTML sprinkled in as necessary. This is then compiled into HTML, has a header
-and footer stuck on to it, and is released as a fully-formed HTML webpage.
-Because of the ease of writing in Markdown and then being able to translate that
-into HTML, it just made sense to do it this way instead of manually writing all
-of the HTML.
+and footer stuck onto it, gets its title, meta tag, and CSS links set, and is
+released as a fully-formed HTML webpage. Because Markdown is easy to write and
+translates cleanly into HTML, it made more sense to do it this way than to
+write all of the HTML by hand.
Of course, I could use a static site generator or some other program, but I
enjoy finding my own way to do things since it is a fun challenge and is a way
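The compile step added in this hunk (convert Markdown, glue on a header and footer, fill in the title and CSS links) can be sketched in plain shell. This is a minimal illustration, not the author's actual script (which lives in the site's git repository): the fragment names `header.html` and `footer.html` and the `{{TITLE}}` placeholder are assumptions, and the converted Markdown body is faked with an `echo`.

```shell
#!/bin/sh
# Sketch of the page-compile step: wrap a converted Markdown body in a
# header/footer pair and substitute the page title. File names and the
# {{TITLE}} placeholder are illustrative assumptions.
set -eu

workdir=$(mktemp -d)
trap 'rm -rf "$workdir"' EXIT

# Stand-in header fragment with a title placeholder and a CSS link.
cat > "$workdir/header.html" <<'EOF'
<html><head><title>{{TITLE}}</title>
<link rel="stylesheet" href="/base.css"></head><body>
EOF
cat > "$workdir/footer.html" <<'EOF'
</body></html>
EOF

# Pretend this line came out of a Markdown-to-HTML converter.
echo '<p>Hello, world.</p>' > "$workdir/body.html"

# Glue header + body + footer, filling in the page title as we go.
sed 's/{{TITLE}}/About This Site/' "$workdir/header.html" \
    > "$workdir/page.html"
cat "$workdir/body.html" "$workdir/footer.html" >> "$workdir/page.html"

grep -c '<title>About This Site</title>' "$workdir/page.html"  # prints 1
```

Any Markdown converter could feed the `body.html` step; the gluing itself is just concatenation plus a `sed` substitution.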
@@ -70,14 +70,14 @@ possible for serving to clients. The service I use is linked below
<a href=#3></a> and is interacted with through their API from my webpage
-I am constantly tweaking and changing the script which compiles the webpages in
-order to achieve the best outcome with the least amount of manual effort. For
-example, one of my current hurdles is compiling specific pages with certain CSS
-files while leaving those files out of other pages. Right now, because there is
-only a small section of CSS for one specific page, all of the CSS gets put into
-the base.css file. Although this is technically inefficient as far as
-bandwidth usage, it doesn't make a noticeable difference since the amount of
-page-specific CSS is so tiny.
+Once the pages and necessary accompanying files are created and compiled, they
+are uploaded to the web server into a folder called `uploads/` using the
+`rsync -rR` command so that the parent folders are copied along with each
+file. On the web server, a script detects changes in this `uploads/`
+directory and copies the files it finds, maintaining the folder structure,
+into the website's directory.
+All of the scripts used can be found in the aforementioned git repository.
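The server-side half of the upload flow described above (walk `uploads/`, mirror every file into the web root while keeping its relative path) can be sketched with `find` and `cp`. This is an illustration under assumed paths, not the author's actual script; the real one, as noted, is in the git repository, and it presumably reacts to changes rather than doing a full walk each time.

```shell
#!/bin/sh
# Sketch of the server-side sync: mirror every file under an uploads
# directory into the web root, recreating parent directories so the
# folder structure survives. Both directories are temporary stand-ins.
set -eu

uploads=$(mktemp -d)
webroot=$(mktemp -d)

# Simulate an rsync'd upload that arrived with its parent folder.
mkdir -p "$uploads/pages"
echo '<p>about</p>' > "$uploads/pages/about-site.html"

# For each uploaded file (path relative to uploads/), recreate its
# directory under the web root and copy it across.
(cd "$uploads" && find . -type f) | while read -r f; do
    mkdir -p "$webroot/$(dirname "$f")"
    cp "$uploads/$f" "$webroot/$f"
done

test -f "$webroot/pages/about-site.html" && echo ok  # prints ok
```

On the sending side, `rsync -rR` does the matching work: with `--relative`, the path after a `/./` marker in the source argument is reproduced on the destination, which is why the parent folders arrive along with each file.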