Uncategorized

Deploying and compiling a TypeScript Node.js project to Azure

Firstly, let me say that I love Azure, and especially the simplicity it provides when deploying projects directly from GitHub – I absolutely love being able to check in to the master branch of a project and have it automatically deploy to the Azure Web App it is configured against… usually.

TypeScript is not automatically compiled on deployment.

When developing applications using TypeScript (a good idea for any Node project) you would be forgiven for assuming that Azure Web Apps has the built-in ability to recognise that you are using TypeScript and automatically run tsc (the TypeScript compiler) after a deploy… but you would be wrong.

I have spent the last few hours getting this sorted out and, at last, I have been successful, so I am sharing the solution with you to save you a little time and let you once again enjoy continuous integration and deployment with Azure and your TypeScript projects.

Azure CLI

The first thing we need to do is install the Azure CLI globally using npm. It's dead simple and is done using the following command…

npm install azure-cli -g

Once that is done we need to log in to the Azure CLI using your Azure credentials. Again, this is dead simple and is done with the following command (follow the instructions in the command window: load up the URL it gives you in a browser and enter the code)…

azure login

Great, once that is all done and dusted we need to change the config mode. First change directory to the root of your Node project and run the following command…

azure config mode asm

Once done, we want to use the Azure CLI to generate some deployment scripts for us; these scripts are run by Azure during each deployment. Run the following command to generate them…

azure site deploymentscript --node

Great! That generates two files, .deployment and deploy.sh. We want to edit the deploy.sh file to add a command that tells Azure to run tsc (the TypeScript compiler) for our project, generating all our .js and .js.map files.

In the deploy.sh file locate the following code…

# 3. Install npm packages
if [ -e "$DEPLOYMENT_TARGET/package.json" ]; then
  cd "$DEPLOYMENT_TARGET"
  eval $NPM_CMD install --production
  exitWithMessageOnError "npm failed"
  cd - > /dev/null
fi

Immediately after it, add the following…

# 4. Build TypeScript
cd "$DEPLOYMENT_TARGET"
TSC="$DEPLOYMENT_TARGET/node_modules/typescript/bin/tsc"
echo "Building TypeScript files."
"$TSC"
exitWithMessageOnError "Could not run 'tsc'. Did 'npm install' run OK?"
echo "Finished TypeScript build."
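
It's worth noting that running tsc with no arguments like this means the compiler looks for a tsconfig.json in the deployment folder, so make sure your project has one checked in. A minimal example might look something like the following (the target, outDir and include values here are just placeholders for your own project layout)…

{
  "compilerOptions": {
    "target": "es6",
    "module": "commonjs",
    "sourceMap": true,
    "outDir": "."
  },
  "include": [
    "src/**/*.ts"
  ]
}
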
Now all we need to do is commit our changes (including the new deployment files), push, and watch Azure automatically compile our project for us 🙂

Ensure TypeScript is added to project dependencies

It is worth noting that in order for this to work, we need to make sure that TypeScript is listed as a project dependency rather than only installed globally – and because the deploy script runs npm install --production, it needs to be in dependencies rather than devDependencies. Let's sort that by running this command from the root of your project…

npm install typescript --save

This should be easier.

I truly believe that, as TypeScript is a Microsoft-developed superset of JavaScript, this compilation should be automatic. I would be unsurprised if Microsoft added this feature to Azure later, but for now this should get you going.

Enjoy.

Uncategorized

My guide to building stable UX experiences for chat bots.

Chat bots are becoming more popular; however, it is very easy to ruin a user's experience with your bot if you don't consider a number of important points when developing one.

As I have written more of them and watched how they get used, I have built up a list of dos and don'ts for developing your bot, and I thought today I would share them with you. Enjoy!

Provide a help command to support the user.

You should always ensure that the user can ask for help whenever they are stuck or having problems. The help command should provide a guide to how your bot works and list the things it is intended to be used for.

Further, it is a great idea to ensure that the help menu is available from the main menu.
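
For example, with the Bot Builder SDK for Node (which my later posts use), a global help command can be wired up as a dialog with a trigger action. The sketch below is only an illustration – the dialog name, the wording and the regular expression are placeholders, and bot is assumed to be your builder.UniversalBot instance…

var builder = require('botbuilder');

// `bot` is assumed to be a builder.UniversalBot created elsewhere.
// A help dialog that can be reached from anywhere in the conversation.
bot.dialog('help', function (session) {
    session.endDialog("Here's what I can do: order food, track an order or leave feedback. Just type what you need.");
}).triggerAction({
    matches: /^(help|\?)$/i // fire whenever the user types "help" or "?"
});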

Set expectations of execution time.

If the bot needs to call an external API or run a long process before responding to the user, make sure they are aware of this and know how long they can expect to wait. Don't leave the user in a situation where the bot appears idle – they will likely start issuing repeat requests, and that can cause issues.

If supported, the "typing" event can really help here; it makes the bot seem more natural, almost as if it is pondering its answer.
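
With the Bot Builder SDK for Node this is a one-liner. In the sketch below, lookupNextDelivery is a made-up stand-in for whatever slow work your bot actually does, and session is the usual Bot Builder session object…

// Show the typing indicator while a slow operation runs, then reply.
session.sendTyping();
session.send("Just checking with the restaurant, this usually takes a few seconds…");
lookupNextDelivery(session.message.user.id, function (err, minutes) {
    if (err) {
        session.send("Sorry, I couldn't reach the restaurant just now. Please try again in a moment.");
        return;
    }
    session.send("Good news! Your order should arrive in about %d minutes.", minutes);
});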

Include a greeting and a walkthrough for first time users.

First time the user has used the bot? Greet them and provide a little information about how they can get the most from it: give them some details about the bot's purpose and perhaps some example commands to help them get started.
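
On channels that raise a conversationUpdate event when someone joins, you can hook that event to greet first-time users. This is a rough sketch with the Bot Builder SDK for Node – not every channel sends the event, and the greeting text is just an example…

var builder = require('botbuilder');

// Greet new members as they join the conversation (channel support varies).
// `bot` is assumed to be a builder.UniversalBot created elsewhere.
bot.on('conversationUpdate', function (message) {
    if (!message.membersAdded) { return; }
    message.membersAdded.forEach(function (member) {
        // Ignore the event fired when the bot itself is added.
        if (member.id === message.address.bot.id) { return; }
        bot.send(new builder.Message()
            .address(message.address)
            .text("Hi! I'm the delivery bot. Try 'order pizza' or 'track my order', or type 'help' at any time."));
    });
});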

Avoid asking for information the user has already shared.

Remember to take note of important information that the user may have shared in previous messages or dialogs. If the user has supplied their address before, save it for later use; it makes the bot seem much more intelligent and provides a great experience too.

“Would you like me to deliver to the same address you mentioned earlier? 1 Appletree Lane?”
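
With the Bot Builder SDK for Node, session.userData is a handy place to keep details like this between messages and conversations. The sketch below is only an illustration – the deliveryAddress field name is made up, and code like this would normally sit inside a dialog step…

var builder = require('botbuilder');

// Save the address the first time the user shares it…
session.userData.deliveryAddress = "1 Appletree Lane";

// …then reuse it later instead of asking again.
if (session.userData.deliveryAddress) {
    builder.Prompts.confirm(session,
        "Would you like me to deliver to the same address you mentioned earlier? " +
        session.userData.deliveryAddress + "?");
} else {
    builder.Prompts.text(session, "Where should I deliver to?");
}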

Avoid sending plain URL links instead of buttons.

When sending links to external websites, it's much nicer to wrap the link in a button than to simply send an http:// URL as a plain text message.
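
With the Bot Builder SDK for Node you can do this with a card and an openUrl action, which channels such as Facebook render as a proper button. The title, text and URL below are placeholders…

var builder = require('botbuilder');

// Send a link as a tappable button rather than a bare URL.
var card = new builder.HeroCard(session)
    .title('Track your order')
    .text('See live updates on your delivery.')
    .buttons([
        builder.CardAction.openUrl(session, 'https://example.com/track/12345', 'Open tracking page')
    ]);

session.send(new builder.Message(session).addAttachment(card));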

Include a welcome menu.

You should include a welcome screen with an easy-to-use menu of buttons that start the bot's main capabilities. It is always nice to include a friendly message that points to the main features too.

Use several distinct variations of each reply, including your standard failure message.

Each response should have several variations. Variety in the replies makes the bot much nicer to interact with and gives the impression of more human-like behaviour.
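
The Bot Builder SDK for Node makes this easy because session.send accepts an array of strings and picks one at random each time; the wording below is just an example…

// Passing an array makes the SDK pick one variation at random on each reply.
session.send([
    "Sorry, I didn't quite catch that.",
    "Hmm, I'm not sure what you mean. Could you rephrase?",
    "I didn't get that one… I am only a bot after all!"
]);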

Give users the ability to leave feedback.

I like to do this at the end of the bot's process when the conversation has finished: when the bot says goodbye to the user, take the opportunity to ask if they would care to leave a little feedback about their experience. This information is extremely valuable.

Ensure the bot has some personality.

Every bot should have a distinct personality. Take it as an opportunity to think about what kind of personality best reflects your brand, and keep this in mind when crafting your bot's responses.

Avoid low quality images and graphics.

Bad images and graphics can really kill a good experience and ruin the professional image your bot should be portraying.

Greet the user by name.

Usually the name of the user is available to you when a conversation is started; make the bot seem more personal and greet the user by their name.
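
In the Bot Builder SDK for Node the name (where the channel provides it) is available on session.message.user, so a greeting can be as simple as the sketch below…

// Fall back gracefully when the channel doesn't supply a display name.
var userName = session.message.user.name;
if (userName) {
    session.send("Hi %s! Great to see you again.", userName);
} else {
    session.send("Hi there! Great to see you again.");
}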

Don't try to solve the Turing test.

Sometimes, having the bot handle a few requests that fall outside its main purpose can inject some fun, but don't be afraid to guide the user as much as possible and narrow the scope. If the user asks a question or makes a request outside the scope of the bot, don't be afraid to simply tell them so – you can even make this nicer by adding some personality.

Hey, I didn’t quite get what you meant by that… I am a bot after all. If you need help with what I can do for you, just ask.

Uncategorized

Introducing Botbuilder-QuickReplies for Microsoft Bot Builder (Node)

Today I released botbuilder-quickreplies, an open source Node package that can be used to simplify support for Facebook Messenger's Quick Reply functionality when building chat bots with the Microsoft Bot Framework.

What are Quick Replies?

[Screenshot: Quick Replies shown in Facebook Messenger]

Quick Replies are a Facebook Messenger feature that lets chat bots supply a set of buttons corresponding to common replies, which the user can tap instead of typing out their response. It's much quicker and can greatly simplify the bot UX.

Currently, however, Microsoft Bot Builder does not natively support them without adding custom JSON payloads to the response messages returned from the bot. Also, because quick replies are only supported by Facebook, some manual checking is needed to ensure that the conversation channel is a Facebook one – all things we want to avoid if possible.

Quick Replies are not sent back as standard replies

The other problem with supporting quick replies is that, when one is tapped, the response sent back to the bot does not contain the message text of the tapped button by default. Instead it arrives as a sourceEvent payload that needs to be inspected and fetched if it exists. When using recognizers such as LUIS with your bot this is a problem too, and a bit of a pain to deal with.

This can be fixed with some custom middleware that inspects each incoming message for the sourceEvent and replaces the message.text property with the payload before LUIS fires and starts to recognize the intent.
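
In rough terms, that middleware looks something like the sketch below: it copies the quick reply payload into message.text before anything else sees the message. This is only an illustration of the idea – the exact sourceEvent field path is an assumption on my part, not the package's actual code…

// Illustrative sketch only – the sourceEvent field path is an assumption.
bot.use({
    botbuilder: function (session, next) {
        var msg = session.message;
        var source = msg.sourceEvent;
        // Only Facebook sends quick reply payloads, so check the channel first.
        if (msg.address.channelId === 'facebook' &&
            source && source.message && source.message.quick_reply) {
            // Surface the payload as plain text so LUIS and the dialogs see a normal reply.
            msg.text = source.message.quick_reply.payload;
        }
        next();
    }
});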

 

BotBuilder-QuickReplies to the rescue…

Here is how all this is solved with my new package…

Simply install the package into your project…

npm install botbuilder-quickreplies --save

Add the middleware to the bot before any recognisers or additional middleware is added.

var quickReplies = require('botbuilder-quickreplies');
var bot = new builder.UniversalBot(connector, function (session) {
    session.send("Hello I am a chatbot.");
});

// Set the middleware.
bot.use(quickReplies.QuickRepliesMiddleware);

And to add quick replies simply make use of the utility classes that I have added for your benefit…

// Create a message with some text.
var message = new builder.Message(session).text('Some text');

// Add some quick replies.
message = quickReplies.AddQuickReplies(session, message, [
    new quickReplies.QuickReply(session, 'This is my title', 'This is my message'),
    new quickReplies.QuickReply(session, 'This is another title', 'This is other message', 'https://upload.wikimedia.org/wikipedia/commons/thumb/3/3f/Button_Icon_Blue.svg/768px-Button_Icon_Blue.svg.png') // with optional image
]);

// Send the message.
session.send(message);

That is all there is to it. The package checks that the channel is Facebook before adding quick replies or parsing their responses, so no additional checks are needed.

I hope you enjoy the package.

NPM: https://www.npmjs.com/package/botbuilder-quickreplies

GitHub: https://github.com/benjaminpaul/botbuilder-facebook-quickreplies

 

Uncategorized

Announcing the JUST EAT Xbox One Beta.

Well, today was a good day. We publicly released the official beta of JUST EAT for Xbox One! If you would like to try it out, we have a limited number of codes that you can use to redeem it on the Xbox Store right now!

How To Get The JUST EAT Xbox One Beta:

1) On your Xbox One, make sure you are signed in to your Microsoft account.

2) From the Home screen, scroll right to Store.

3) Under Games, select Use a code.

4) Enter your personal redemption code: GXRR9-XWYPV-9D2CF-YCPMD-WRY9Z

 

Early next week we will be sending a survey to all those who take part, to get your feedback and help us build an even better experience for gamers.

Thanks Guys – Enjoy the Beta!

 

 

 

Load Testing

Handling ASP.NET MVC’s Anti-Forgery Tokens when load testing with JMeter

When building web applications that are intended to scale it’s a really good idea to stress test them to ensure that they can handle the load expected of them.

JMeter is a popular application for building stress test plans; it makes HTTP calls that emulate real traffic to your site and can simulate things like form submission and login, among other things.

Handling ASP.NET’s AntiForgeryToken

To guard against cross-site request forgery I use ASP.NET MVC's [ValidateAntiForgeryToken] filter on the controller action and the @Html.AntiForgeryToken() helper in the view. This means that if a POST request does not include or match the correct token, the request will fail.

When attempting to log in to the website you are testing, you first need to make a GET request to the login action, which generates a new token. We then need to extract that token in JMeter using a "Regular Expression Extractor" and store it in a variable so that we can use it when we make the POST request containing our credentials.

Here is what my “Regular Expression Extractor” looks like in JMeter:

[Screenshot: Regular Expression Extractor configuration in JMeter]

Some key points to note:

  • The Reference Name field names the variable that we will reference later in the POST request we make to log in.
  • The Regular Expression field tells JMeter how to extract the value – it matches the hidden input that @Html.AntiForgeryToken() renders (an input named __RequestVerificationToken with a hidden type and the token as its value) using:
name="__RequestVerificationToken" type="hidden" value="([A-Za-z0-9+=/\-\_]+?)"

 

Now that we have the token placed within the antiForgery variable we can reference it when we make the POST request like so:

[Screenshot: POST request parameters for the login action in JMeter]

As you can see from the screenshot, we are making a POST to our login controller passing in the Username and Password values as well as the __RequestVerificationToken, whose value is set to ${antiForgery}. The ${variable} syntax is used within JMeter whenever you want to supply a value from a variable.

Uncategorized

Conversations as a platform & Microsoft Bot Framework – Slides and Code

Thanks to everyone who came along to my talk on the 20th September at JUST EAT. It was a great turnout and I met some awesome devs; the talk seemed to go down well and the HoloLens was very popular too!

On the day we talked about what bots are (and what they are not), and I also introduced LUIS (Language Understanding Intelligent Service). We then took our chances and built a bot live on stage that could tell us when the next .NET South West event was being held and whether one of your friends was attending, and we even joined everything up with the Amazon Echo so that Alexa could ask our bot who the raffle winner was!

All the code and slides (as well as the LUIS model for importing) can be found on my github page at the following URL:

https://github.com/benjaminpaul/dotnetsouthwest-bot

Thank you for coming along, see you at the next one!

Uncategorized

Talking: Conversations as a platform & Microsoft Bot Framework – September 20th 2016 (Bristol)

Well, it is almost time for my first talk at the mighty .NET South West user group, which is held at the JUST EAT offices here in Bristol each month, and I am pretty excited to talk about something I have been working with very closely over the past few months!

I have recently been working with Microsoft at their Redmond campus, as well as with Skype in London and California, to get to grips with some of the concepts behind building chat bots that can be interacted with on various platforms such as Skype, email, SMS, Kik and Facebook Messenger, and I am super excited to talk about some of the things I have learned regarding the Microsoft Bot Framework and "conversations as a platform" in general. As I am sure you have noticed, bots have become a bit of a buzzword lately, so let's demystify them a little and look at what we have available to us as .NET developers so that we can build our own.

Building a bot live

During the talk I intend to build a bot from scratch – from creating the solution and grabbing the required libraries from NuGet, all the way to publishing to Azure and registering the bot with the Bot Connector! Along the way I will of course explain a little about the classes I use, such as Dialog and LuisDialog, as well as demonstrating how we can use machine learning to create bots that understand language and greatly improve the experience we have with them.

To end the talk we will have the bot randomly select a raffle winner so we can dish out some lovely developer swag to a few lucky people (Wahhoooo!).

That’s not all… Fancy a go on the HoloLens?

As if the joy of coming to see me talk was not quite enough (ahem), I will also have a Microsoft HoloLens available for people to play with and experience – time will be limited, so make sure you get there early to avoid disappointment.

Sign up for free here!

 

I hope you can make it, looking forward to seeing you there.

Uncategorized

Innovating Bristol on the 8th October with Bristol City Council & JUST EAT.

What happens when you take some of the country's most talented developers and a heap of real-time transport data from your local city? Awesomeness, that's what.

A few months ago I happened to stumble on an advert posted on Twitter by my local council asking for developers to participate in a hackathon aiming to solve some of the city's transport struggles; in exchange the winners could expect £100 for their troubles (as well as the kudos of making everyone's commute that much easier, of course).

Announcing new sponsorship from JUST EAT.

I am pleased to say that after a few phone calls and emails JUST EAT are now sponsoring the event by offering use of our fantastic offices located in central Bristol as well as an amazing breakfast, free pizza and best of all – free beer!

As well as all of the free food and refreshments, we will be making sure that our fantastic team of engineers and UI/UX experts are on hand to help you with your projects, and we will be supplying lots of amazing new tech for everyone to try and have fun with (HoloLens, anyone?).

Free beer? Free Food? Sign me up!

If you like the idea of coming down to join us, all you have to do is sign up on Eventbrite (it's completely free) and bring a laptop on the day – you are welcome to come as a team or as an individual, and you don't even need to know how to code (although it does help)!

SIGN UP HERE

I will be posting more information over the coming week but I think it is safe to say we are all super excited about what we can create on the 8th. See you there.