Preparing to Present

Over the past semester, I’ve been working with Dr. Vallejos to build a website for Massachusetts HOSA. At the conclusion of my independent study project, I will be presenting the project to the Computer Science faculty and other CS students. In preparing for this presentation, I came to a couple of realizations about what I’ve learned from the experience.

While I certainly think that I have improved my technical skills in CSS and PHP, what is perhaps more valuable is the immense amount of real-world project management experience that I have gained. This experience has already allowed me to build a better understanding of project requirements at work and for the software development capstone project with AMPATH Informatics. Being able to understand the requirements of stakeholders is essential to delivering a product that meets their expectations. Asking the right questions the first time prevents having to reach out again and again for clarification. People are generally very busy and will not always be available to answer questions or provide information. Whether it is a customer, manager, or product owner, it is best not to waste other people’s time with follow-up questions that stem from your own failure to fully consider the project’s requirements.

I also believe that I greatly improved my personal software development process throughout this project. Although it took a couple of mistakes for me to learn, I am thankful that I made these mistakes in a safe environment and lost nothing but a few hours of my time. I was initially pretty careless, making customization changes to the theme files directly on the web server itself, not backing up, and not tracking any of my changes. After losing all of my theme customizations by updating the theme, I decided to change this process. I implemented Git version control, which lets me make and test changes locally before pushing them to the live website, tracks changes incrementally, and allows me to roll back to any revision as desired. I also implemented automatic offsite backup to Google Drive, which runs weekly to ensure that even if I do break something, there is always a working copy safely stored elsewhere.

I have always been an avid believer in learning through experience, and the MassHOSA website project has been a fantastic opportunity to do just that. Not only have I had the chance to sharpen my technical skills and widen my skill set, but I have also gained invaluable experience managing a project and working with stakeholders to bring an idea from the conceptual phase through to a working product.

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Looking Back on the Final Working Sprint

Just like that, I am writing the final sprint retrospective for our capstone project working with AMPATH Informatics! I have learned so much about the development process, contributing to open source software, and especially working in a Scrum development team. I am extremely grateful for the opportunity to work with the AMPATH developers, my team, who were extremely helpful and made this experience valuable. I am thankful that the software development capstone has allowed me to contribute to a real-world application, and hopefully make some small difference.

In terms of concrete tasks for this sprint, I took on fewer development stories than in previous sprints and chose to focus more on documenting and producing tests for what I have already contributed. Other team members were assigned development tasks, however, and our implementation of an offline login is close to being a shippable product. The main item blocking a pull request to AMPATH is the lack of encryption for the stored credentials. The credentials are currently only encoded in base64, which is about as good as plain text. We are unsure whether a working encryption implementation will be available before time runs out.
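
To illustrate why base64 offers essentially no protection, here is a minimal TypeScript sketch (the values are illustrative, not from the codebase): base64 is a reversible encoding, so anyone who can read the stored value can recover the original credentials.

// Minimal sketch: base64 is an encoding, not encryption.
const encoded = btoa('secretPassword');   // "c2VjcmV0UGFzc3dvcmQ="
const decoded = atob(encoded);            // "secretPassword" -- trivially recovered
console.log(encoded, decoded);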

One of the development tasks that I took on for this sprint was to update the refresh time of the online tracker indicator on the bottom of the screen. When Dominique added a checkbox element to the user interface, she used a subscription that updated every three seconds. As a result, the checkbox would appear even when the online tracker indicated that the user was still offline. To fix this, I updated the refresh time of the online tracker component to match the subscription of the checkbox.
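
The pattern involved is roughly the following (a simplified sketch assuming an interval-based RxJS subscription; the actual OnlineTrackerComponent in ng2-amrs differs in its details and dependencies):

import { Component, OnDestroy, OnInit } from '@angular/core';
import { interval, Subscription } from 'rxjs';

const REFRESH_MS = 3000; // match the checkbox's three-second subscription

@Component({
  selector: 'online-tracker',
  template: '{{ isOnline ? "Online" : "Offline" }}'
})
export class OnlineTrackerComponent implements OnInit, OnDestroy {
  isOnline = false;
  private sub: Subscription;

  ngOnInit(): void {
    // Poll connectivity on the same cadence as the checkbox so the two
    // pieces of UI cannot disagree for more than one refresh cycle.
    this.sub = interval(REFRESH_MS).subscribe(() => {
      this.isOnline = navigator.onLine;
    });
  }

  ngOnDestroy(): void {
    this.sub.unsubscribe();
  }
}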

Another task that was assigned to Dominique and Luigi for this sprint was to implement the backend logic for the checkbox. While there are still some bugs, we should be able to work through them as a team if they are not resolved by the time we meet for review and retrospective. This logic should store credentials in localStorage only when the checkbox is checked.
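
In outline, the intended behavior looks something like the sketch below (the function and key names are illustrative, not the team's actual code): credentials are written to localStorage only when the user has opted in via the checkbox.

// Persist credentials for offline login only when the checkbox is checked.
function cacheCredentials(username: string, passwordHash: string, rememberOffline: boolean): void {
  if (!rememberOffline) {
    return; // nothing is stored unless the user opts in
  }
  localStorage.setItem('offlineCredentials', JSON.stringify({ username, passwordHash }));
}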

A task that was assigned to Matt was to document the current status of our offline login implementation. During our in-class meetings, I discussed with Matt how my implementation works and where I made changes to the code in order to allow the user to log in offline. I think that we will continue this task into the final presentation preparation sprint, where we all will be documenting our contributions.

Kwame was assigned the task of looking at writing tests for the offline login implementation. While he had some trouble writing tests, I think that this is something that we can all work on in the final sprint as part of documenting what we have done. I think that it might be easier to write tests for the code that we have contributed as individuals, rather than assigning all of the test writing to one person.

As mentioned earlier, we are waiting on an encryption service for the storage of user credentials from another team. While we were able to accomplish most of the requirements for the offline login implementation, the lack of encryption has kept us from submitting much of anything to AMPATH. Storing the user’s credentials in plain text is far too risky from a security standpoint, and I am doubtful that the developers would accept our implementation without encryption.

I am very happy with the progress that we’ve made as a team. I have certainly improved from the beginning of the semester, and it has been great to see other members of the team improve as well. I’m looking forward to the last sprint where we will compile all of what we have learned and implemented into a presentation.

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Thinking Like an End User

I am getting ready to deliver a website product that I have been working on for Massachusetts HOSA. Because I’ve been working on the development of the website for the past couple of months, I am familiar with where to find everything. Once the product is delivered, however, it will be updated and maintained by Massachusetts HOSA. While I would be perfectly happy to continue helping out with the website as needed, I would like to minimize the need for my involvement by making the website as self-sustaining as possible.

In previous blog posts, I outlined the setup of automatic backups. This makes me feel much better about enabling automatic updates for the WordPress installation. With automatic updates enabled, the site will be kept secure and up to date as WordPress and plugin or theme developers release new versions. I would be wary of allowing automatic updates if I were unsure whether current backups existed, because of the possibility of an update breaking the site. Occasionally there are incompatibilities between particular plugin and WordPress version combinations, or simply bugs in a release, that could make the site unstable. In such a scenario, having a recent backup that can quickly be rolled back to is essential.

The second part of making the site self-sustaining is to write documentation for the use of this specific WordPress installation. While WordPress is already extremely well documented, this vast documentation can sometimes be difficult to navigate efficiently. I would like to pick and choose the essentials to include in a slimmed-down version of documentation to provide to MassHOSA as a guide for the maintenance and updating of the website. This documentation will include guides for use of the WordPress platform, use of the various plugins that are installed, and also references to the locations of various resources such as backups and styling files.

I am extremely thankful for the opportunities that working on this project has granted me. While I may have had some prior experience building WordPress websites, this was quite different. I got a much better idea of the various stages of a design project and experience working directly with stakeholders to turn specifications into a working, real-world implementation.

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Looking Back on Sprint 5

Before heading into the last working sprint, I would like to reflect on how happy I am with the progress made by all members of the team during the fifth sprint. It would seem that my hope was realized: getting a rudimentary implementation of the offline login pushed up to the team repository did generate some buzz and got other team members pushing code as well. I’m looking forward to seeing what we can accomplish in the final push.

One of the story items that I was assigned for this sprint was the “Take offline status checking outside of error checking” task. This was not too difficult to accomplish, but did require some investigation into how to make use of a service in TypeScript. While it seems like a relatively basic concept, it was the first time I had ever attempted to use a service. Following the examples set by the AuthenticationService and SessionService, I managed to piece together how services are imported, included as part of the constructor, and then used within the class.
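
The general pattern I pieced together looks roughly like this (a sketch only; the import path, selector, and the OnlineTrackerService member used here are assumptions for illustration rather than the actual ng2-amrs code):

import { Component } from '@angular/core';
import { OnlineTrackerService } from './online-tracker.service'; // path assumed

@Component({
  selector: 'login',
  templateUrl: './login.component.html'
})
export class LoginComponent {
  // Angular injects the service through the constructor.
  constructor(private onlineTracker: OnlineTrackerService) {}

  login(): void {
    // The injected service can then be used anywhere in the class,
    // for example to branch between online and offline authentication.
    if (this.onlineTracker.isOnline) {   // member name assumed for illustration
      // ...normal online authentication path...
    } else {
      // ...offline authentication path, e.g. check cached credentials...
    }
  }
}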

While working on changing my rudimentary implementation of the offline login to use the OnlineTrackerService rather than the error checking logic of the LoginComponent, I discovered a bug. Because I needed the OnlineTrackerService working for my offline login implementation, I chose to investigate and attempt to fix this bug. The bug was not difficult to identify, and I fixed it relatively quickly. After attempting to build the project, however, one of the tests failed. I looked into the failing test and found that it expected the incorrect value for isUpdating when offline, causing the test to fail even though the program was behaving properly. I corrected this test and submitted a pull request to AMPATH, which can be seen here: https://github.com/AMPATH/ng2-amrs/pull/671. Just a few hours before writing this post, the pull request was approved and merged into AMPATH’s master. While this task was not originally assigned at the beginning of the sprint and was done out of necessity, I feel that it was an important contribution. For this reason, I added it to the Trello board as a story item and marked it as complete once the pull request was accepted.

Although I was a bit distracted during the middle of the sprint by the OnlineTracker bug, I got back down to business with our offline login implementation once the bugfix code was submitted to AMPATH. I noticed that the build had failed for the code I pushed to take the offline status checking outside of error checking, and set out to investigate why. The description given by the testing framework was vague, but I eventually determined that the test was failing because of a missing import in login.component.spec.ts. Because I was now using the OnlineTrackerService in login.component.ts, I also had to import the service in the test file. Once I added this import, the build passed.
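
The shape of the fix is roughly as follows (a hedged sketch; depending on how the spec constructs the component, the service may also need to be registered as a provider, which is shown here for completeness):

import { TestBed } from '@angular/core/testing';
import { LoginComponent } from './login.component';
import { OnlineTrackerService } from './online-tracker.service'; // the previously missing import

beforeEach(() => {
  // The spec must declare the component and know about its new dependency,
  // otherwise the test build fails once the component injects the service.
  TestBed.configureTestingModule({
    declarations: [LoginComponent],
    providers: [OnlineTrackerService]
  });
});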

The final task that I took on for the sprint was fixing our severely broken team Git tree. Because we did not branch for development of the offline login, our master branch became cluttered with many small, meaningless commits and merge commits. This became apparent when I attempted to submit a pull request to AMPATH for my bugfix. I worked with Matt in class to move all of the commits related to the offline login into a new offline-login branch. He then developed guidelines for a Git workflow for the team repository.

I’m very excited about what we’ve accomplished this sprint. It is rather unfortunate that just as we are getting up to speed and comfortable working with the ng2-amrs application, we are entering our final working sprint. I’m still hopeful that we will be able to make significant progress towards a working implementation in the last sprint.

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Transitioning

After meeting to discuss the current status of the website, there are only a few tasks that remain. Although I am still waiting on some of the design content such as images and social media links to be provided, I think that the website design will soon be wrapped up. Once this happens, the next step will be education and training on the use and maintenance of the site. This should not be too intense because of how intuitive WordPress is to use.

Part of this training will likely involve transferring the hosting off of my personal virtual server to a permanent host. While I do not mind hosting the site for the time being, I do not want to be responsible if my server goes down. When using a well-known hosting provider, you are paying for someone else to take on this responsibility. I have prepared the site to be migrated, and I do not anticipate any issues with the migration. WordPress is rather portable, requiring not much more than a few directories and a small database.

One item that still needed to be addressed was the size of the font used on the website. Although it appeared appropriate on my screen, it was difficult to read from a distance on higher resolution monitors. While I had tested the website in a few different browsers and even on my mobile phone, none of these allowed me to view the site as if I were using a higher resolution monitor. During the meeting, when viewing the site at a higher resolution, the text appeared to be “zoomed out” and was difficult to read in some of the lower contrast areas of the page.

The next thing that I will be looking at for the MassHOSA project is QuickBase. I am familiar with the platform from an internship where I am currently auditing and validating user access to QuickBase. Despite this familiarity, there may be a few obstacles to making the desired changes. A quick inspection of the application showed that many of the features required to make those changes are blocked by the QuickBase tier in use. I will be looking for workarounds and discussing potential solutions during my next meeting.

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Preparing to Migrate a WordPress Site

Now that I’ve got a functional website built for the MassHOSA project, it is time to start preparing to move the website to its permanent home. Development has been straightforward partly because it has taken place while the site has been living on my personal virtual private server. With full SSH access to the development server, it was much easier to make server-side tweaks to various environment settings. Many of these tweaks had more to do with my server being misconfigured than with WordPress, however. I am hopeful that the permanent hosting environment that is selected will require minimal modifications. Many of the hosts that we’ve looked at, for example, have environments tailored specifically for WordPress hosting.

To begin preparing, I copied the entire WordPress directory to my local machine using SCP. While this took some time, I wanted to be sure that everything was transferred and remained intact. I did not necessarily trust that FTP was up to the task, as I have had some problems with file integrity after using FTP for large-scale file transfers. While there may have been many other contributing factors, I thought I would try SCP instead this time, at least for the downloading of the website files to my local computer. FTP may be the only option for uploading the files to the new host, as many shared hosts do not allow SSH access.

The next step of the preparation process was to export and download the contents of the database associated with the installation. Choosing how I export the tables is important, because of the limited privileges that may be available for importing the data on the new host. To ensure that I would be able to import the tables on the new host, I used the account used by WordPress to access the database, and exported all of the tables in the database. This way, even if the new host allows only one database, I will be able to migrate all of the necessary tables and simply update the wp-config file to point to the correct database.

Thankfully, if anything goes wrong during the setup of the site on the new host, I have the working installation on my virtual server to fall back on while working things out. I hope that I have not overlooked anything and that the migration will be straightforward and painless.

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Looking Back on Sprint 4

I can’t believe that we’ve already completed four sprints! I am happy with the progress and minor breakthrough that we made during this sprint. It took a good deal of researching, trial and error, and investigation to gain the base of knowledge required to implement the rudimentary offline login authorization that I pushed to the team’s repository this sprint. I’m hoping that getting some code pushed up to the repository will be an impetus for driving renewed energy in sprint planning and the next sprint. Other things that we got done this sprint were important for ensuring that we remain on track to integrate our work with the work of the other teams.

Some of the tasks that we completed during this sprint are no longer applicable because of how we have decided to move forward in development. One task that is no longer applicable is the "Locate server code" task. Locating the server code is no longer applicable because we have implemented the offline authentication in a way that mimics a response from the server rather than mocking the server itself.

Other tasks that may or may not remain applicable are the "Install PouchDb on AMPATH app" and "Create 'Mock' of PouchDB (dependent on PouchDB investigation)" tasks. These tasks were still worthwhile because the offline data storage team appears to be using PouchDB, even if we do not proceed with storing credentials in it ourselves. We will likely move forward with storing the login credentials using localStorage instead, because there is far less overhead and the ng2-amrs application already includes services for it.

While we attempted the "Contact AMPATH team to determine if we are taking the correct approach" task, we have not yet heard back.

One of my assigned tasks for this sprint was "Store user credentials in localStorage so the user can login offline." I managed to store both the "auth.Credentials" and "user" objects in localStorage once the user successfully logged in online. This user information is later extracted from localStorage when the user attempts to log in offline and is used to create a session that authenticates the user.
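
In outline, the flow looks something like the sketch below (the storage keys match the object names mentioned above, but the function names and the session-creation step are illustrative rather than the exact ng2-amrs code):

// Called after a successful online login: cache what we need for later.
function onOnlineLoginSuccess(credentials: object, user: object): void {
  localStorage.setItem('auth.Credentials', JSON.stringify(credentials));
  localStorage.setItem('user', JSON.stringify(user));
}

// Called when the user attempts to log in while offline.
function offlineLogin(): boolean {
  const credentials = JSON.parse(localStorage.getItem('auth.Credentials') || 'null');
  const user = JSON.parse(localStorage.getItem('user') || 'null');
  if (!credentials || !user) {
    return false; // nothing cached, so offline login is not possible
  }
  // Build a session object that mimics the server's authentication
  // response, then hand it to the existing session handling code.
  return true;
}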

Tasks that we will be carrying over to future sprints include "Create back end design of new UI using Balsamiq," "Collaborate with 'Everyone Else'," and "Collaborate with 'Field Idiots' (Sprint to Sprint)." These tasks will be carried over because they are ongoing and will never really be completed.

Although many of the tasks for this sprint may end up no longer being applicable moving forward, the team seems to have been productive. In planning for the next sprint, I think it will be much easier to assign discrete tasks now that we have a basic implementation on which we can build functionality and add features. I’m hopeful that we will be knocking off to-do items and have a fully functioning offline authentication in no time.

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Using Breakable Toys

I am a strong believer in not being afraid to fail. Failure is how we learn and improve. If you are not facing failure, then perhaps you are not pushing yourself hard enough. It is difficult to grow as an individual and certainly as a software developer if you are not pushing your personal limits. Pushing the limits of what you are comfortable with will inevitably mean failure at one point or another. It is important to know that failure is perfectly acceptable, and learning from your failures helps you to grow.

I was relieved to learn that the ideas presented in Hoover and Oshineye’s Apprenticeship Patterns aligned so well with my personal thinking. The Breakable Toys pattern specifically mentions not being afraid to fail, and gives advice on creating a safe environment to try things. Because it would be dangerous and risky to do your experimenting at work, Hoover and Oshineye recommend creating a safe space. What you create in your safe space should be relevant to your work as an apprentice, and similar in toolset but smaller in scope.

While I have made quite a few of my own programs just messing around to gain familiarity with a particular subject or idea, the specific nature of these programs makes them lose relevance quickly. They are abandoned shortly after they serve their purpose of familiarizing me with an idea. I like the solution presented by Hoover and Oshineye to create software such as a wiki, game, blog, or IRC client. These types of software will not lose their relevancy, as they can be continuously used and further developed. New features can be added that not only serve practical uses, but allow for new opportunities for learning and practice.

Creating software for personal use is far less risky than playing around on company time. If you do it right, you may even end up with something useful in addition to the knowledge you gain from your failures. I am looking forward to attempting to develop one of the tools suggested in the Breakable Toys pattern. While I do not think I will be writing any software that lives up to Torvalds’s breakable toy, I can certainly appreciate the value in trying, and also in failing.

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.

Using Git with WordPress

As part of my continued efforts to not lose all of my hard work, I’m implementing tools to help me track changes and have decided to use version control to do it. I’ve chosen to use Git because of my relative familiarity with the tool.

For a bit of background, my web server is running Ubuntu 16.04.3 LTS and the latest version of WordPress at the time of this writing, version 4.9.4. Because GitLab allows free private repositories and the nature of the project makes a public repository undesirable, it was chosen over GitHub. One thing to note about this setup is that I have full shell access to the server, allowing me to install programs and edit settings as necessary to get things set up. When the website is eventually migrated to its permanent hosting location, some changes to the following setup may be necessary to accommodate the new server. Many shared hosting providers do not allow shell access, and a new strategy would need to be considered in that case.

I started the setup by performing a bit of housekeeping with

sudo apt-get update

and then performed the initial Git installation with

sudo apt-get install git

I then performed the usual Git setup, uploading my SSH user’s key to GitLab and setting my username/email with

git config --global user.name "Your Name"

git config --global user.email "youremail@domain.com"

After cd’ing to the directory of the website files, I initialized the repository and added the GitLab remote with

git init

git remote add origin git@gitlab.com:MassHOSA/masshosa-website.git

An important step here is to make sure that no sensitive files are tracked by Git. I did this by adding a .gitignore with the following:

#------------------------
#  Main ignored items
#------------------------
/../wp-config.php
/wp-config.php
.maintenance
versionpress.maintenance
/.htaccess
/web.config
/wp-content/*
!/wp-content/db.php
!/wp-content/index.php
!/wp-content/plugins/
/wp-content/plugins/versionpress/
!/wp-content/mu-plugins/
!/wp-content/themes/
!/wp-content/languages/
!/wp-content/uploads/
!/wp-content/vpdb/

#------------------------
#  Log files
#------------------------
*.log
error_log
access_log

#------------------------
#  OS Files
#------------------------
.DS_Store
.DS_Store?
._*
.Spotlight-V100
.Trashes
ehthumbs.db
*[Tt]humbs.db
*.Trashes

At this point, it was safe to issue a

git add .

and commit with

git commit -m "Initial commit"

and finally push changes with

git push --set-upstream origin master

And that’s all there was to it. I’m now tracking all of the changes that I make to theme and plugin files, which are the only files whose changes I really care about reverting and recovering. Everything else is backed up regularly using Updraft.

From the blog CS@Worcester – ~/GeorgeMatthew/etc by gmatthew and used with permission of the author. All other rights reserved by the author.
