Author Archives: Michael Duquette

Angular / GoogleSheets / CRUD

We need to access a Google Sheet for basic CRUD operations from Angular. There are several great Node modules that can perform CRUD operations on Google Sheets, but most require OAuth2, and we want to keep this fairly straightforward by using a Google Service Account and a JWT (JSON Web Token).

The best fit module for this is:
[google-spreadsheet](https://www.npmjs.com/package/google-spreadsheet)
This handy little module takes care of authentication as well as providing the hooks into the Google Sheet. But since it is a Node.js module that we are trying to run in a browser, we have to create a workaround. Here’s a quick overview of what we need to do to get this up and running. First we’ll enable the Google Sheets API and create a service account. Next we’ll extend Webpack in Angular to set up our workaround. Then we’ll configure our components and, for this example, write a method that writes to the Google Sheet.

This is going to be a little lengthy, so I won’t go over creating an Angular project. All right, let’s get started!

First enable the Google Sheets API and create the Service Account:

Go to the Google Developers Console and navigate to the API section. You should see a dashboard.

Click on “Enable APIs” or “Library”, which should take you to the library of services you can connect to. Search for and enable the Google Sheets API.

Go to Credentials and select “Create credentials”.

Select “Service Account” and proceed forward by creating this service account. Name it whatever you want. I used SheetBot.

Under “Role”, select Project > Owner or Editor, depending on what level of
access you want to grant.

Select JSON as the Key Type and click “Create”. This should automatically
download a JSON file with your credentials.

Rename this credentials file to credentials.json and place it in a new sheet-api directory under the src/assets directory of your project (the component code below requires it from assets/sheet-api/).

The last super important step here is to take the “client_email” that is in your credentials file and share the sheet you’re working in with that email address. If you do not do this, you will get an error when trying to access the sheet.
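For reference, a service account key file has roughly this shape (the values below are placeholders; client_email is the address you share the sheet with):

```
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "...",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "sheetbot@your-project-id.iam.gserviceaccount.com",
  "client_id": "..."
}
```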

Configure Angular:

Now let’s start configuring the Angular project to play nice with the Node.js packages we’ll be installing.

Edit tsconfig.app.json: add "node" to the "types": [] section, and paste this right below it: "typeRoots": [ "../node_modules/@types" ]

The two should look like this:

```
"types": [ "node" ],
"typeRoots": [ "../node_modules/@types" ]
```

***Don’t forget your commas***
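In context, the relevant part of tsconfig.app.json would look something like this (the surrounding options are illustrative; yours may differ):

```
{
  "extends": "../tsconfig.json",
  "compilerOptions": {
    "outDir": "../out-tsc/app",
    "types": [ "node" ],
    "typeRoots": [ "../node_modules/@types" ]
  }
}
```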

Since we’ll be mocking Node.js, we need to add the Node typings to the Angular project. Install them by running this from the terminal:

```
npm install @types/node --save
```

Now let’s extend Webpack. We’ll be following some of the steps that Vojtech Masek provides in this [article](https://medium.com/angular-in-depth/google-apis-with-angular-214fadb8fbc5).

Install the Angular custom webpack builder:
```
npm i -D @angular-builders/custom-webpack
```
Now we have to tell Angular to use the correct builders for the custom webpack. Open up angular.json and replace the builder in architect with:

```
"builder": "@angular-builders/custom-webpack:browser",
```
Then paste this into the options section right below it:
```
"customWebpackConfig": {
  "path": "./extra-webpack.config.js"
},
```

It should look like this:
```
"architect": {
  "build": {
    "builder": "@angular-builders/custom-webpack:browser",
    "options": {
      "customWebpackConfig": {
        "path": "./extra-webpack.config.js"
      },
```

and under serve replace builder with:
```
"builder": "@angular-builders/custom-webpack:dev-server",
```
It should look like this:
```
"serve": {
  "builder": "@angular-builders/custom-webpack:dev-server",
```
More details about using custom webpack configurations can be found in the angular-builders
[docs](https://github.com/just-jeb/angular-builders/tree/master/packages/custom-webpack#Custom-webpack-dev-server).

In your project’s root directory, create a new JavaScript file and name it:

extra-webpack.config.js

Paste this code into it:
```
const path = require('path');

module.exports = {
  resolve: {
    extensions: ['.js'],
    alias: {
      fs: path.resolve(__dirname, 'src/mocks/fs.mock.js'),
      child_process: path.resolve(
        __dirname,
        'src/mocks/child_process.mock.js'
      ),
      'https-proxy-agent': path.resolve(
        __dirname,
        'src/mocks/https-proxy-agent.mock.js'
      ),
    },
  },
};
```

So what does all of that do? We are telling webpack to resolve these module names to our mock JavaScript files instead of trying to load the real Node.js modules. Let’s create the mocks. First create a folder in the project’s src folder and name it mocks. In the mocks folder create three JavaScript files:

child_process.mock.js
fs.mock.js
https-proxy-agent.mock.js

Paste this code into the mocks for child_process and fs:

```
module.exports = {
  readFileSync() {},
  readFile() {},
};
```

For the https-proxy-agent mock use this code:
```
module.exports = {};
```

These mock methods let the Node.js modules think they are running normally while in reality doing nothing. Next we need to provide a way to handle process and Buffer calls, since the Node code can’t find those globals in the browser. To do so, install these two packages:
```
npm i -D process buffer
```
Now add these to the Application imports in polyfills.ts:
```
import * as process from 'process';
(window as any).process = process;

import { Buffer } from 'buffer';
(window as any).Buffer = Buffer;
```

and add a small script to the head section of index.html so that code expecting Node’s global variable finds one (a minimal shim, along the lines of the referenced article):
```
<script>
  if (global === undefined) {
    var global = window;
  }
</script>
```

OK, almost there! At this point we have Angular configured with the mocks and are ready to install a few more modules. First, let’s install google-spreadsheet:
```
npm i google-spreadsheet --save
```
Depending on the platform you are on, you may see warnings indicating that the optional fsevents module was not installed. Since it’s listed as optional, I ignored it. I’m working on a Windows 10 device and had to install these modules to make the compiler happy:

eslint
fs
child_process
net
tls
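On my machine that amounted to something like this (the exact flags may differ for you):

```
npm i -D eslint fs child_process net tls
```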

Wait, didn’t we just mock fs and child_process? Yes, but the compiler still sees them listed as dependencies and wants them installed. Now that we have everything installed and configured, let’s try it out.

Wrapping up

I added a contact component and created a contact form with an onSubmit function. The onSubmit function passes the form’s jsonObject to the addRow method for the Google Sheet. Here’s a minimal sketch of my contact.component.html, reconstructed around the form controls the component defines:

```
<h1>Sheet Crud Example</h1>

<form [formGroup]="contactForm" (ngSubmit)="onSubmit()">
  <input type="text" formControlName="fullName" placeholder="Full Name" />
  <input type="email" formControlName="email" placeholder="Email" />
  <textarea formControlName="message" placeholder="Message"></textarea>
  <button type="submit">Send</button>
</form>
```

Not very elegant, and yes, I still need to add a form reset, but it gets the job done for now. For contact.component.ts I added these two imports first:
```
import { FormGroup, FormBuilder, FormControl } from '@angular/forms';
import { HttpClient } from '@angular/common/http';
```
The forms imports build the form, while HttpClient is used by the google-spreadsheet module to send the JWT for authentication and to perform our CRUD operations.

I then added the Node-related consts:
```
const GoogleSpreadsheet = require('google-spreadsheet');
const creds = require('../../assets/sheet-api/credentials.json');
```
If you forgot to install the Node types (npm i @types/node) you will get an error here, because TypeScript doesn’t know about require without them. If you get a message suggesting you convert the require to an import, just ignore it.
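As an aside: if the typings still aren’t picked up and the compiler complains that require is undefined, a common workaround (my addition, not from the module’s docs) is to declare it yourself near the imports:

```
// Fallback declaration so TypeScript accepts require without @types/node.
declare var require: any;
```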

Next I configured my constructor:
```
constructor(private formBuilder: FormBuilder, private http: HttpClient) {
  this.contactForm = new FormGroup({
    fullName: new FormControl(),
    email: new FormControl(),
    message: new FormControl()
  });
}
```

Then I set up the onSubmit method:

```
contactForm: FormGroup;

onSubmit() {
  const jsonObject = this.contactForm.value;
  console.log('Your form data: ', this.contactForm.value);
  const doc = new GoogleSpreadsheet('***YOUR GOOGLE SHEETID***');
  doc.useServiceAccountAuth(creds, function (err) {
    // Once authenticated, add the form values as a new row on sheet 1.
    doc.addRow(1, jsonObject, function (err) {
      if (err) {
        console.log(err);
      }
    });
  });
}
```
So what exactly are we doing here? We take the current contactForm values and assign them to jsonObject. Then we log them and create a new GoogleSpreadsheet. It’s important that you replace ***YOUR GOOGLE SHEETID*** with the actual ID of the Google Sheet you are working with. You can find it by opening the Google Sheet in your browser; the ID is that really long string between the /d/ and the /edit:

https://docs.google.com/spreadsheets/d/***2he8jihW5d6HGHd3Ts87WdRKqwUeH-R_Us8F3xZQiR***/edit#gid=0
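If you’d rather not eyeball the URL, a tiny helper like this (hypothetical, my own addition) can pull the ID out for you:

```
// Hypothetical helper: extract the sheet ID from a full Google Sheets URL.
function extractSheetId(url: string): string | null {
  const match = url.match(/\/d\/([a-zA-Z0-9_-]+)/);
  return match ? match[1] : null;
}

// extractSheetId('https://docs.google.com/spreadsheets/d/abc123/edit#gid=0')
// returns 'abc123'
```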

doc then calls the useServiceAccountAuth method in google-spreadsheet, passing the credentials.json, and inside its callback calls addRow. This authenticates the session and lets us add a row to the existing sheet. If you have the browser console open you will see a couple of warnings. The first warning (which looks intimidating but is not) is a compiler warning:

Critical dependency: the request of a dependency is an expression

You may also see a console log that says:

Error: incorrect header check

This comes from the function (err) callback in the useServiceAccountAuth method: it’s a Node-style error callback that Angular cannot process correctly and that we haven’t coded around.
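If those callbacks bother you, one option (a sketch of my own, not part of the original post) is to wrap the callback-style API in a Promise so errors land somewhere Angular can deal with them:

```
// Sketch: Promise wrapper around the callback-style google-spreadsheet API.
function addRowAsync(doc: any, sheetIndex: number, row: object): Promise<void> {
  return new Promise((resolve, reject) => {
    doc.useServiceAccountAuth(creds, (authErr: any) => {
      if (authErr) { return reject(authErr); }
      doc.addRow(sheetIndex, row, (rowErr: any) =>
        rowErr ? reject(rowErr) : resolve()
      );
    });
  });
}

// Usage: addRowAsync(doc, 1, jsonObject).catch(err => console.log(err));
```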

…and that wraps it up. Why am I writing a message like this to a Google Sheet instead of using an email component? I’m using Google Sheets as the backend for a web app, and a script in Google Sheets forwards the message as an email and logs it.

Check out the google-spreadsheet [repository](https://github.com/theoephraim/node-google-spreadsheet)
for additional details on the calls you can make and how to use the module.

From the blog Home | Michael Duquette by Michael Duquette and used with permission of the author. All other rights reserved by the author.


Mockito, not the lovely beverage Mojito

Does your unit test have a stunt double? You know, someone to jump in there and take the hit during testing instead of using real dependencies? Well, it could if you used Mockito. I found a great tutorial on setting up Mockito for various environments and situations here: https://www.vogella.com/tutorials/Mockito/article.html

But what is Mockito? Don’t confuse it with that lovely mint-based rum concoction. It’s a Java-based mocking framework, used to mock interfaces so that dummy functionality can be used in unit testing. And what is mocking? In OOP, mock objects are simulated objects that mimic the behavior of real objects (stunt doubles!). In Test Driven Development, mock objects meet the interface requirements of real objects, allowing developers to write and unit-test functionality. This lets developers focus on the behavior of the system under test without worrying about its dependencies. This Martin Fowler article does a great job explaining mocks: https://martinfowler.com/articles/mocksArentStubs.html

So grab a Mojito and enjoy some reading! #CS@Worcester #CS-443

From the blog Michael Duquette by Michael Duquette and used with permission of the author. All other rights reserved by the author.

Gradle Clover Part 2

Before we get started, there are a couple of assumptions I am making: you have a current Gradle project you want to add code coverage testing to, and you are using JUnit 5 (Jupiter) for testing.

First and foremost, back up your existing project. If following my steps causes you to screw something up, that’s on you, especially since my first step in this adventure clearly states back up your existing project! Backup complete? OK, let’s move on.

Go to GitHub and fork the gradle-clover-plugin repository: https://github.com/bmuschko/gradle-clover-plugin

Now open your project’s build.gradle and the plugin’s build.gradle. Wildly different, right?! Now merge the two. My project didn’t have any funky dependencies or imports, so I literally copied the plugin’s build.gradle and overwrote mine. There are about nine different plugins used for the Gradle Clover Plugin. Read the project’s build.gradle and understand it; this will help assure there are no failures. Remember, I did tell you to back up your project before starting.

Once you sort out your build.gradle, look in the Gradle Clover Plugin repository and open up the gradle folder. Copy the six *.gradle files to your project’s gradle folder. These should be:

additional-artifacts.gradle
documentation.gradle
functional-testing.gradle
integration-test.gradle
publishing.gradle
release.gradle

Now back to the Gradle Clover Plugin repository: copy the src folder to your project and overwrite your src folder. You will now have three folders in your src folder:

funcTest
main
test

funcTest contains all the functional testing scripts, while main and test contain an additional groovy folder with scripts specific to Groovy. If you just overwrote your src directory, your code should still be in the java folders under main and test. If not, go to your backup and copy them over. This is why we do backups.

Now once again back to the Gradle Clover Plugin repository: copy the gradle.properties file from the root of the repository to the root of yours. Open it up with your favorite editor (I like to use Notepad++) and check it out. What are we looking at here? This sets your current Gradle version build level and the testing versions as well. You can modify this to test against specific versions of Gradle. TIP: keep the gradleTestingVersion list short and only test against the versions of Gradle you need to; this will reduce your build time.

Now open a bash terminal and fire off a gradle build. Oh, your code failed with a compilation error because of Javadocs and GroovyDocs? Clean up your code! No seriously, I was shocked the first time I ran it that I had 12 errors, all related to Javadocs. These were errors that JaCoCo, SpotBugs, Checkstyle and PMD did not report. One of them was: @param firstName was missing a description. It was helpful to have the line numbers right there on the screen, and I was quickly able to resolve my issues.

Now that you’ve cleaned up your code, do another gradle build. Once your build succeeds, navigate to the $repository_name/build/reports/tests folder and you will see two sub-folders, test and functionalTest. Drill into these folders and open the index.html file to see your test results.

Now that you are all set up, head over to OpenClover and to the gradle-clover-plugin repository and learn how to tweak the setup for your use. #CS@Worcester #CS-443

From the blog Michael Duquette by Michael Duquette and used with permission of the author. All other rights reserved by the author.

OpenClover vs. JaCoCo

This is the first in a series of two posts. First we’ll cover the why; the second post will cover the how. Getting OpenClover set up on an existing project took some massaging that will be rather lengthy to explain.

OpenClover is based on Atlassian’s Clover product. Atlassian released the source code for Clover and offers OpenClover as a free and open source product. Why OpenClover instead of JaCoCo? Its code coverage is more complete than JaCoCo and the other alternatives. OpenClover provides plugins for CI servers like Jenkins, Bamboo and Hudson, and integrates with Ant, Maven, and Grails. OpenClover also integrates with IntelliJ IDEA and Eclipse. OpenClover covers Java, Groovy and AspectJ while tracking global coverage and per-test coverage. Reports are generated on three kinds of metrics: method coverage, statement coverage, and branch coverage. For a complete list of features see the features page at OpenClover.org.

I find the ability to integrate OpenClover with my IDE (IntelliJ or Eclipse; I use both) appealing. Being able to run tests from the IDE would be very helpful, and I’m not sure whether any of the other code coverage products can integrate with an IDE. Follow this link for a general comparison of code coverage products; I found it helpful when researching alternatives to JaCoCo, and unlike most reviews/comparisons I have found online, it includes links to each of the products discussed.

Stay tuned for my second post on implementing OpenClover with an existing Gradle project. #CS@Worcester #CS-443

From the blog Michael Duquette by Michael Duquette and used with permission of the author. All other rights reserved by the author.

Gitlab Review Apps deploying to Heroku

**Gitlab Review Apps with Heroku**

First, a couple of assumptions: you currently have an app set up that deploys from Gitlab to Heroku, you have a HEROKU_API_KEY variable set up in Gitlab under CI/CD, and your organization has identified a need to review code before submitting it to your QA or pre-production environment.

With the help of the .gitlab-ci.yml file we can set up a review environment that will automatically build, deploy and tear down a review app. So how do we accomplish this? Magic! Well, OK, maybe not magic, but it’s a pretty slick process of using branches and the Heroku API. Ready to take a peek behind the curtain? Let’s get started!

**Heroku setup steps:**

There aren’t any! All the work is done within git. The review app (test app or whatever you want to call it) is all handled in the .gitlab-ci.yml and by creating a branch of your repository.

**Gitlab setup steps:**

We need to make a few modifications to the .gitlab-ci.yml. Open the .gitlab-ci.yml in your favorite text editor. I prefer to use DPL to deploy from Gitlab to Heroku; the syntax is easy to follow and much cleaner. Add this near the top of the YAML:

```
before_script:
  - apt-get update -qy
  - apt-get install -y ruby-dev curl
  - gem install dpl
```

This tells Gitlab to check for updates, install Ruby (needed for dpl) and install dpl. I like to use a variable for passing around the review app name. Add a variables section to the YAML like this:

```
variables:
  REVIEW_APP_NAME: "$CI_COMMIT_REF_SLUG-$CI_COMMIT_SHORT_SHA"
```

$CI_COMMIT_REF_SLUG is the name of the branch the commit was performed from and $CI_COMMIT_SHORT_SHA is the shortened commit id. Now we can use REVIEW_APP_NAME wherever we need to. ***Note that using the $CI_COMMIT_SHORT_SHA will allow multiple review apps to run without collision.***

Now let’s look at the stages section of the YAML. This is what mine looks like:

```
stages:
  - review
  - staging
  - production
```

On Heroku I have both a staging and a production app defined, and the push to each is based on the branch I am working in. I also created a pipeline in Heroku that both apps are assigned to; with the click of a button I can promote the app in staging to production. Add the - review to your stages and we’ll set up the start_review and stop_review sections of the YAML next.

**Review Magic:**

Copy and paste the following into your YAML below the stages section:

```
start_review:
  stage: review
  script:
    - cd ./frontend
    - echo "$REVIEW_APP_NAME " $REVIEW_APP_NAME
    - >-
      curl -n -X POST https://api.heroku.com/apps
      -d '{ "name": "'"$REVIEW_APP_NAME"'", "region": "us" }'
      -H "Content-Type: application/json"
      -H "Accept: application/vnd.heroku+json; version=3"
      -H "Authorization: Bearer $HEROKU_API_KEY"
    - dpl --provider=heroku --app=$REVIEW_APP_NAME --api-key=$HEROKU_API_KEY
  environment:
    name: review/$CI_COMMIT_REF_NAME
    url: https://$CI_COMMIT_REF_SLUG-$CI_COMMIT_SHORT_SHA.herokuapp.com
    on_stop: stop_review
  only:
    - branches
  except:
    - master
    - staging
```

Let’s review what’s going on here. First we specify that this is a review stage. Since I am deploying an Angular app on Heroku and have both a frontend and a backend in my repository (I know nested repositories aren’t best practice, but it’s what I have to work with on this project), I first cd into my frontend folder so that the deployment takes place from there; you may not need to do this based on your configuration. I echo the $REVIEW_APP_NAME so that it appears in the Gitlab logs when the pipeline is run. The curl (Client URL) statement makes the API POST request to create the empty app named $REVIEW_APP_NAME. The dpl statement performs the Gitlab deployment to the newly created Heroku app. The environment settings are for Gitlab and define what happens once the deployment completes; the on_stop section calls the stop_review section of the YAML. You will notice I tell Gitlab that only branches are used and to exclude the master and staging branches.

Now copy this into the YAML:

```
stop_review:
  stage: review
  script:
    - echo "environment is being stopped!"
    - >-
      curl -n -X DELETE https://api.heroku.com/apps/$REVIEW_APP_NAME
      -H "Content-Type: application/json"
      -H "Accept: application/vnd.heroku+json; version=3"
      -H "Authorization: Bearer $HEROKU_API_KEY"
  variables:
    GIT_STRATEGY: none
  when: manual
  environment:
    name: review/$CI_COMMIT_REF_NAME
    action: stop
```

This is all pretty straightforward. stop_review only runs on the review stage. The curl statement builds and sends the DELETE API call to delete the app named $REVIEW_APP_NAME. It is processed manually, either by stopping the job in the Gitlab pipeline or stopping it in the environments section of the project repository. Once you click ‘stop’, the delete command is sent to Heroku and the app is torn down and deleted.

Now save the YAML and open up a terminal session. Create a review branch and push it up to the upstream repository:

```
git checkout -b review
git add .
git commit -m 'Review App setup'
git push -u origin review
```

Log back into Gitlab and check your repository pipelines; you should see your latest job running. Log into Heroku and check your dashboard; you will see the new deployment pop up once Gitlab finishes processing it.

After reviewing the app on Heroku, when you are ready to tear it down, go back into Gitlab and go to either the Operations/Environments section of the repository or the CI-CD/Pipelines section. If you choose to stop the app under Operations/Environments, you will see the review app listed there. On the right-hand side of the screen will be a big red stop square; click the square and answer the popup confirmation prompt to stop the app. If you choose to stop the app under CI-CD/Pipelines, you will see the pipeline running; click the running stage under the Stages column and click the stop button to the right of stop_review. Either way, confirm the app has been torn down by checking on Heroku (you may need to refresh your browser to see it drop off).

There, now you magically have a review app that gets served and torn down automatically. #CS@Worcester #CS-448
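As a quick sanity check (my own addition, not part of the original post), you can ask the Heroku API directly whether the review app exists, both after deployment and after teardown:

```
curl -n https://api.heroku.com/apps/$REVIEW_APP_NAME \
  -H "Accept: application/vnd.heroku+json; version=3" \
  -H "Authorization: Bearer $HEROKU_API_KEY"
```

A 200 response means the app is live; a 404 confirms the teardown worked.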

From the blog Home | Michael Duquette by Michael Duquette and used with permission of the author. All other rights reserved by the author.
