Category Archives: Computer Science

Preparing to Migrate a WordPress Site

Now that I’ve got a functional website built for the MassHOSA project, it is time to start preparing to move the website to its permanent home. Development has been straightforward, partly because it has taken place while the site has been living on my personal virtual private server. With full SSH access to the development server, it was much easier to make server-side tweaks to various environment settings. Many of these tweaks had more to do with my server being misconfigured than with WordPress, however. I am hopeful that the permanent hosting environment that is selected will require minimal modifications. Many of the hosts that we’ve looked at, for example, have environments tailored specifically for WordPress hosting.

To begin preparing, I copied the entire WordPress directory to my local machine using SCP. While this took some time, I wanted to be sure that everything was transferred and remained intact. I did not necessarily trust that FTP was up to the task, as I have had some problems with file integrity after using FTP for large-scale file transfers. While there may have been many other contributing factors, I thought I would try SCP instead this time, at least for the downloading of the website files to my local computer. FTP may be the only option for uploading the files to the new host, as many shared hosts do not allow SSH access.
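
For anyone curious, the transfer itself boils down to a single recursive scp, optionally followed by a checksum comparison to confirm that nothing was corrupted in transit. The user, host, and paths below are placeholders rather than the real values, and the checksum step assumes GNU coreutils is available on both machines:

# Pull the whole WordPress directory down from the VPS (placeholder user/host/paths)
scp -r -p user@vps.example.com:/var/www/html/wordpress ./wordpress-copy

# Optional integrity check: compare per-file checksums on both ends
ssh user@vps.example.com 'cd /var/www/html/wordpress && find . -type f -exec md5sum {} +' | sort -k 2 > remote.md5
(cd wordpress-copy && find . -type f -exec md5sum {} +) | sort -k 2 > local.md5
diff local.md5 remote.md5 && echo "All files match"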

The next step of the preparation process was to export and download the contents of the database associated with the installation. Choosing how I export the tables is important, because of the limited privileges that may be available for importing the data on the new host. To ensure that I would be able to import the tables on the new host, I used the account used by WordPress to access the database, and exported all of the tables in the database. This way, even if the new host allows only one database, I will be able to migrate all of the necessary tables and simply update the wp-config file to point to the correct database.
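
In practice, that export amounts to a mysqldump run as the WordPress database user. Leaving off the --databases flag keeps CREATE DATABASE and USE statements out of the dump, so the tables can be loaded into whatever single database the new host provides. The user and database names here are placeholders, not the real ones:

# Dump every table in the WordPress database (prompts for the password from wp-config.php)
mysqldump -u wp_user -p masshosa_db > masshosa_db.sql

# On the new host: load the tables into whichever database is available, then update
# DB_NAME, DB_USER, DB_PASSWORD, and DB_HOST in wp-config.php to match
mysql -u newhost_user -p newhost_db < masshosa_db.sql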

Thankfully, if anything goes wrong during the setup of the site on the new host, I have the working installation on my virtual server to fall back on while working things out. I hope that I have not overlooked anything and that the migration will be straightforward and painless.

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Looking Back on Sprint 4

I can’t believe that we’ve already completed four sprints! I am happy with the progress and minor breakthrough that we made during this sprint. It took a good deal of researching, trial and error, and investigation to gain the base of knowledge required to implement the rudimentary offline login authorization that I pushed to the team’s repository this sprint. I’m hoping that getting some code pushed up to the repository will be an impetus for driving renewed energy in sprint planning and the next sprint. Other things that we got done this sprint were important for ensuring that we remain on track to integrate our work with the work of the other teams.

Some of the tasks that we completed during this sprint are no longer applicable because of how we have decided to move forward in development. One task that is no longer applicable is the “Locate server code” task. Locating the server code is no longer applicable because we have implemented the offline authentication in a way that mimics a response from the server rather than mocking the server itself.

Other tasks that may or may not be applicable are the “Install PouchDb on AMPATH app” and “Create ‘Mock’ of PouchDB (dependent on PouchDB investigation)” tasks. These tasks were certainly worthwhile even if we do not proceed with storing credentials using PouchDB, because the offline data storage team appears to be using PouchDB. We will likely move forward with storing the login credentials using localStorage, because it has far less overhead and there are already services written into the ng2-amrs application for it.

While we attempted the “Contact AMPATH team to determine if we are taking the correct approach” task, we have not yet heard back.

One of my assigned tasks for this sprint was “Store user credentials in localStorage so the user can login offline.” I managed to store both the “auth.Credentials” and “user” objects in localStorage once the user successfully logged in online. This user information is later extracted from localStorage when the user attempts to log in offline and is used to create a session that authenticates the user.

Tasks that we will be carrying over to future sprints include “Create back end design of new UI using Balsamiq,” “Collaborate with ‘Everyone Else’,” and “Collaborate with ‘Field Idiots’ (Sprint to Sprint).” These tasks will be carried over because they are ongoing and will never really be completed.

Although many of the tasks for this sprint may end up no longer being applicable moving forward, the team seems to have been productive during this sprint. In planning for the next sprint, I think it will be much easier to assign discrete tasks now that we have a basic implementation on which we can build functionality and add features. I’m hopeful that we will soon be knocking off to-do items and will have fully functioning offline authentication in no time.

 

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Using Breakable Toys

I am a strong believer in not being afraid to fail. Failure is how we learn and improve. If you are not facing failure, then perhaps you are not pushing yourself hard enough. It is difficult to grow as an individual and certainly as a software developer if you are not pushing your personal limits. Pushing the limits of what you are comfortable with will inevitably mean failure at one point or another. It is important to know that failure is perfectly acceptable, and learning from your failures helps you to grow.

I was relieved to learn that the ideas presented in Hoover and Oshineye’s Apprenticeship Patterns aligned so well with my own thinking. The Breakable Toys pattern specifically mentions not being afraid to fail, and it gives advice on creating a safe environment in which to try things. Because it would be risky to do your experimenting at work, Hoover and Oshineye recommend creating a safe space of your own. What you create in that space should be relevant to your work as an apprentice, and similar in toolset but smaller in scope.

While I have written quite a few small programs of my own just to gain familiarity with a particular subject or idea, their narrow purpose makes them lose relevance quickly. They are abandoned shortly after they have served their purpose of familiarizing me with an idea. I like the solution presented by Hoover and Oshineye: create software such as a wiki, game, blog, or IRC client. These types of software do not lose their relevance, as they can be continuously used and further developed. New features can be added that not only serve practical uses, but also create new opportunities for learning and practice.

Creating software for personal use is far less risky than playing around on company time. If you do it right, you may even get something useful in addition to the knowledge you gain from your failures. I am looking forward to attempting to develop one of the tools suggested in the Breakable Toys pattern. While I do not think I will be writing any software that lives up to Torvalds’s breakable toy, I can certainly appreciate the value in trying, and also in failing.

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Using Git with WordPress

As part of my continued efforts to not lose all of my hard work, I’m implementing tools to help me track changes and have decided to use version control to do it. I’ve chosen to use Git because of my relative familiarity with the tool.

For a bit of background, my web server is running Ubuntu 16.04.3 LTS and the latest version of WordPress at the time of this writing, version 4.9.4. Because GitLab allows free private repositories and the nature of the project makes a public repository undesirable, it was chosen over GitHub. One thing to note about this setup is that I have full shell access to the server, allowing me to install programs and edit properties as necessary to get things set up. When the website is eventually migrated to its permanent hosting location, some changes to the following setup may be necessary to accommodate the new server. Many shared hosting providers do not allow shell access, and a new strategy would need to be considered in that case.

I started the setup by performing a bit of housekeeping with

sudo apt-get update

and then performed the initial Git installation with

sudo apt-get install git

I then performed the usual Git setup, uploading my SSH user’s key to GitLab and setting my username/email with

git config --global user.name "Your Name"

git config --global user.email "youremail@domain.com"

After cd’ing to the directory of the website files, I issued the command

git remote add origin git@gitlab.com:MassHOSA/masshosa-website.git

An important step here is to make sure that no sensitive files are tracked by Git. I did this by adding a .gitignore with the following:

#------------------------
#  Main ignored items
#------------------------

/../wp-config.php
/wp-config.php
.maintenance
versionpress.maintenance
/.htaccess
/web.config
/wp-content/*
!/wp-content/db.php
!/wp-content/index.php
!/wp-content/plugins/
/wp-content/plugins/versionpress/
!/wp-content/mu-plugins/
!/wp-content/themes/
!/wp-content/languages/
!/wp-content/uploads/
!/wp-content/vpdb/

#------------------------
#  Log files
#------------------------

*.log
error_log
access_log

#------------------------
#  OS Files
#------------------------

.DS_Store
.DS_Store?
._*
.Spotlight-V100
.Trashes
ehthumbs.db
*[Tt]humbs.db
*.Trashes

At this point it was safe to issue a

git add .

and commit with

git commit -m "Initial commit"

and finally push changes with

git push --set-upstream origin master

And that’s all there was to it. I’m now tracking all of the changes that I make to theme and plugin files, which are the only files whose changes I really care about being able to revert and recover. Everything else is backed up regularly using Updraft.
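
One extra sanity check worth doing before a push like this is to ask Git which ignore rule matches a sensitive file and whether anything sensitive has slipped into the index. Something along these lines, run from the website directory, would do it:

# Show which .gitignore rule (if any) matches wp-config.php
git check-ignore -v wp-config.php

# Confirm no sensitive files are tracked or staged
git status --short
git ls-files | grep -i 'wp-config' || echo "wp-config.php is not tracked"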

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Backing Up

In light of what happened last week, I decided to make coming up with a backup strategy for the Massachusetts HOSA website a priority. After all of the work that I put into the design the first and second time around, I do not want to risk losing it again. In many regards, I am thankful that losing the data happened when it did. It has allowed me to improve my habits and develop in a more efficient and sustainable way.

The first thing that I made sure to do once I had restored the design of the website was to make an initial backup of the files and database. Although there are more efficient ways (I’ll explore some of these later), I chose what was easiest at the time and copied the files from the web server to my local hard drive through an FTP client. Through an SSH session, I dumped the contents of the database to a .sql file and transferred this file to my local computer, again through FTP. I am now far less paranoid about making changes, because I know that I have this backup to fall back on should I mess anything up beyond repair. This backup contains the entire base site, with all design completed per the original specifications.
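
For reference, the SSH half of that process is only a couple of commands; the user, host, and database names below are placeholders, and an FTP client works just as well as scp for pulling the file down:

# On the web server, over SSH: dump the WordPress database to a file
mysqldump -u wp_user -p masshosa_db > ~/masshosa_db_backup.sql

# From the local machine: fetch the dump for safekeeping
scp user@vps.example.com:~/masshosa_db_backup.sql ~/backups/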

After meeting to discuss the next steps, I will undoubtedly be making more changes to the site. Rather than having to initiate these backups manually each time using the process I described above, I would like to have some way of automatically backing up changes on some sort of a regular schedule. After researching plugins that could accomplish this, I found UpdraftPlus. I wanted to use a plugin rather than something server-side because we will be migrating the WordPress installation to a different server following development. By using a plugin rather than some sort of cron job or script on the server I would eliminate the need to completely reconfigure the backup service after the migration.

After initial setup, I ran a forced backup using the UpdraftPlus plugin. Despite a few files that the tool was unable to back up due to incorrect file permissions, the backup ran smoothly and stored all of the pertinent website data, including a database backup, in my Google Drive account. The only thing left to do at this point is to hand off access to the backup location to someone who will be able to reach the backups if they are needed after development. I’m very happy to have found a solution to the problem of backing up, and I am looking forward to not worrying about breaking the website.

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

The Beginnings of my Map

I’m typically not much of a long-term planner. While I have defined some goals and aspirations, they sometimes seem more like rambling dreams than achievable objectives. Although I may know what I want, defining actionable items for getting there is a different and far more challenging accomplishment. I like to make the most of every day, doing the best that I can and hoping that hard work and positivity will take me somewhere amazing, wherever that may be.

Much like the story told by Desi in the Draw Your Own Map pattern in Hoover and Oshineye’s Apprenticeship Patterns, I have begun my Information Technology career in a sort of hybrid role in Systems/Application Administration. Thankfully, my experiences so far have been far different from those described by Desi. Rather than facing pressure to concentrate only on furthering the interests of others, I am encouraged to create a valuable experience for myself.

While I am well aware that it is not the responsibility of anyone else, whether employer, professor, or colleague, to give me a handout, I am extremely grateful for all of the support that I have received in that regard. I feel as though I have a good sense of my career position and potential options for the future because of the support that I’ve received and the experiences that I’ve had.

Although I see absolutely no need to consider other paths at this time, the Draw Your Own Map pattern and the stories shared by Desi and Chris were somewhat reassuring. It is comforting to know that, even when it may feel like it, you are never really stuck. I feel extremely grateful to currently be in a position where my goals and desires for professional advancement are not only heard and considered, but seem to be top priorities. I am in a supportive, flexible environment, where I’m encouraged to set and work toward goals for personal and professional growth, development, and learning. Right now, the possibilities and prospects seem to be wide open. The next step for me is to use this opportunity to discover the values that are important to me.

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Looking Back on Sprint 3

During the third sprint, we began digging into the code of the ng2-amrs application and really started to attempt to gain an understanding of the existing implementation. There were quite a few hurdles throughout this sprint, including the cancellation of both the second in-class work day and the in-class review and retrospective day. This made communicating ideas between team members significantly more difficult, and I think this has also impacted our performance for this sprint. I still believe that we are working well as a team, and doing the best that we can given the circumstances. Standup participation was 100% for this sprint, which (I believe) is a first for the team.

For sprint planning this time around, we chose the “Offline Login Service” story from the product backlog, as this most closely aligned with what we had begun researching during the second sprint. We broke this story up into tasks, some of which were assigned to everyone on the team, and some of which were assigned to individual members.

One of these tasks, assigned to Dominique, was the “Collaborate with ‘Field Idiots’ to determine how to decrypt and encrypt user data/passwords” task. I am interested in learning more about what Dominique discovered about the encryption implementation that the Field Idiots team will be using, but am unsure if she was able to do much collaboration because of the cancellations.

The “Collaborate with ‘Everyone Else’ about API for retrieving offline data/user information” task was assigned to Luigi and Matt. Once again, I am unsure whether or not they were able to achieve much collaboration due to the cancellations. This collaboration is critical to our progress moving forward, as we must be aware of the requests that we should be sending to the local storage databases in order to implement an offline login.

The “Investigate session management” task was assigned to me. The main discovery that I made while investigating how the ng2-amrs application handles sessions is that we may not need to change much about the session itself. If the existing code for session management can be modified for use offline, this would be a far more effective solution than rewriting an entire session manager ourselves.

The remaining tasks were assigned to all of the team members, and were more for big-picture existing implementation understanding and design strategy. The first task was to “Investigate current logon process,” something that we started as a group on the first in-class work day. While we made some progress, I was hoping to use the second in-class work day to share what I had discovered independently and also hear what others had discovered.

The design-strategy task that we created based on the advice of Dr. Wurst was to “Look into ‘Bridge’ design pattern”. I remember looking at the pattern briefly last semester, but needed to refresh myself. I found some online resources that seemed to give a good overview of the pattern and shared them with the rest of the team in our Slack channel.

The final task shared by all team members was the “Create overall architecture/design of offline login feature using Balsamiq.” This task was started during our first in-class work day by Matthew. The design that he created gives a good high-level picture of what our service should accomplish. While I was hoping to discuss possible additions to our design with the rest of the team, this was impacted by the cancellation of class last Thursday.

While there were certainly some hurdles to overcome during the third sprint, I think that we did a good job of making the best of the situation. We communicated more through Slack during this sprint than in previous sprints, and the quality of information that was shared during the standups has improved significantly. Overall, I am happy with our progress and looking forward to getting more in-person collaboration time in the near future.

 

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Recovering from Tragedy

After last week’s post I made a simple click that I’ve made probably hundreds of times before when I updated WordPress on the MassHOSA website. The difference this time, however, was that I wasn’t only updating the WordPress core but also the theme currently being used on the website. Depending on the update, you can sometimes get away with updating themes without losing customizations. If, however, the update includes changes to, say, styles.css, your customizations will be obliterated.

I have nobody to blame but myself for this mistake. I was fully aware that updating a theme could wipe out customizations, and WordPress even shows a handy warning on the theme update screen.

But I was not thinking, clicked update, and permanently lost all of the customizations that I had made to the theme. Here lies another issue: backups. I should have been making regular backups of both the WordPress filesystem and the SQL database, since at this point I had put a significant amount of effort into both the design (stored mainly on the filesystem in CSS and PHP files) and the content (stored mainly in the SQL database).

While this incident was certainly frustrating, it was not that tragic. I was able to restore the customizations (this time in a child theme) in a matter of a few hours, and I am happier with the outcome this time than I was after my original modifications. Using a child theme forced me to do things a bit differently than the first time around, but at this point I already had a pretty good idea of the changes that needed to be made. I am happy to be doing things the right way now, as my original design would not have been sustainable. Making changes directly to the theme’s own CSS files means that you cannot apply updates as they become available. While that would be fine if updates only ever contained feature additions or style changes you were not interested in, it is not advisable when some updates contain security patches. Given the choice between leaving my website vulnerable to attack and losing customizations, I would rather put in a couple of hours doing customizations the right way than face the repercussions of a breach.
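
For anyone facing the same cleanup, the skeleton of a child theme is small enough to create in a couple of commands. The theme names below are placeholders, and depending on the parent theme a functions.php that enqueues the parent stylesheet may also be needed:

# Create a minimal child theme so customizations survive parent theme updates
mkdir -p wp-content/themes/my-theme-child
cat > wp-content/themes/my-theme-child/style.css <<'EOF'
/*
 Theme Name: My Theme Child
 Template:   my-theme
*/

/* Custom CSS overrides go here instead of in the parent theme's own files */
EOF
# Then activate the child theme from Appearance > Themes in the dashboard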

 

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Money Shouldn’t be the (Only) Motivator

While money is certainly important and should be a consideration, it should never be the primary motivator for choosing a particular career, or even for choosing particular projects or roles in a position. Making decisions based on financial considerations alone is a quick way to end up hating what you do and having little opportunity to improve your situation. Making the correct choices about what you use as motivating factors to drive decision-making can help make software craftsmanship a more rewarding and valuable career. Following some of the advice given by Hoover and Oshineye in Apprenticeship Patterns should allow a new or aspiring software craftsman to identify the motivating factors that drive success in the position and lead to a satisfying career.

It is easy to stay motivated when things are going your way and you are working on projects that you find engaging, fun, and adequately challenging without being overly difficult. The problem that the Sustainable Motivations pattern hopes to address, rather, is when things get a bit more difficult. Sometimes it becomes difficult to see the passion in software craftsmanship. For example, in the course of developing the technical skills needed to become a master craftsman, the aspiring developer will sometimes be faced with hurdles such as ambiguously specified projects or the shifting and conflicting demands of customers. It is times like these, argue Hoover and Oshineye, when the motivation and determination of the aspiring craftsman are truly tested.

While I feel lucky to be actively interested and continuously challenged by my current work, I am sure that my luck will run out at some time in the future. I will be presented with a project that I find tedious, frustrating, and maybe even exhausting. I am confident in my ability to remain focused and motivated on working towards the end goal of becoming a craftsman, however. I feel that as Hoover and Oshineye mention, if I ever begin to have doubts they will easily be overcome by a need to continue earning money or the desire to maintain a reputation as a technologist. These motivators should keep me in the craft until my situation improves and passion returns as my primary motivating factor.

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.