Google Assistant Directives

Dialogflow directives expose Google Actions functionality that is platform specific. In general, they take the same parameters you would pass to the Actions on Google Node.js SDK.

List

Actions on Google Documentation

The single-select list presents the user with a vertical list of multiple items and allows the user to select a single one. Selecting an item from the list generates a user query (chat bubble) containing the title of the list item.

const { Image } = require('actions-on-google');

// SELECTION_KEY_* and IMG_URL_* are constants you define for your own action.
app.onState('someState', () => {
  return {
    dialogflowList: {
      title: 'List Title',
      items: {
        // Add the first item to the list
        [SELECTION_KEY_ONE]: {
          synonyms: [
            'synonym of title 1',
            'synonym of title 2',
            'synonym of title 3',
          ],
          title: 'Title of First List Item',
          description: 'This is a description of a list item.',
          image: new Image({
            url: IMG_URL_AOG,
            alt: 'Image alternate text',
          }),
        },
        // Add the second item to the list
        [SELECTION_KEY_GOOGLE_HOME]: {
          synonyms: [
            'Google Home Assistant',
            'Assistant on the Google Home',
          ],
          title: 'Google Home',
          description: 'Google Home is a voice-activated speaker powered by ' +
            'the Google Assistant.',
          image: new Image({
            url: IMG_URL_GOOGLE_HOME,
            alt: 'Google Home',
          }),
        },
        // Add the third item to the list
        [SELECTION_KEY_GOOGLE_PIXEL]: {
          synonyms: [
            'Google Pixel XL',
            'Pixel',
            'Pixel XL',
          ],
          title: 'Google Pixel',
          description: 'Pixel. Phone by Google.',
          image: new Image({
            url: IMG_URL_GOOGLE_PIXEL,
            alt: 'Google Pixel',
          }),
        },
      },
    }
  }
});

Suggestions

Actions on Google Documentation

Use suggestion chips to hint at responses to continue or pivot the conversation. If during the conversation there is a primary call for action, consider listing that as the first suggestion chip.

Whenever possible, you should incorporate one key suggestion as part of the chat bubble, but do so only if the response or chat conversation feels natural.

app.onState('someState', () => {
  return {
    dialogflowSuggestions: ['Exit', 'Continue']
  }
});
app.onState('someState', () => {
  return {
    dialogflowLinkOutSuggestion: {
      name: "Suggestion Link",
      url: 'https://assistant.google.com/',
    }
  }
});

BasicCard

Actions on Google Documentation

A basic card displays information that can include the following:

  • Image
  • Title
  • Sub-title
  • Text body
  • Link button
  • Border

Use basic cards mainly for display purposes. They are designed to be concise, to present key (or summary) information to users, and to allow users to learn more if they choose (using a web link).

In most situations, you should add suggestion chips below the cards to continue or pivot the conversation.

Avoid repeating the information presented in the card in the chat bubble at all costs.

const { Button, Image } = require('actions-on-google');

app.onState('someState', () => {
  return {
    dialogflowBasicCard: {
      text: `This is a basic card. Text in a basic card can include "quotes" and
      most other unicode characters, including emoji. Basic cards also support
      some markdown formatting like *emphasis* or _italics_, **strong** or
      __bold__, and ***bold italic*** or ___strong emphasis___`,
      subtitle: 'This is a subtitle',
      title: 'Title: this is a title',
      buttons: new Button({
        title: 'This is a button',
        url: 'https://assistant.google.com/',
      }),
      image: new Image({
        url: 'https://example.com/image.png',
        alt: 'Image alternate text',
      }),
    }
  }
});

AccountLinkingCard

Actions on Google Documentation

Account linking is a great way to let users connect their Google accounts to existing accounts on your service. This allows you to build richer experiences for your users that take advantage of the data they already have in their account on your service. Whether it's food preferences, existing payment accounts, or music preferences, your users should be able to have better experiences in the Google Assistant by linking their accounts.

app.onState('someState', () => {
  return {
    dialogflowAccountLinkingCard: "To track your exercise"
  }
});

MediaResponse

Actions on Google Documentation

Media responses let your app play audio content with a playback duration longer than the 120-second limit of SSML. The primary component of a media response is the single-track card. The card allows the user to perform these operations:

  • Replay the last 10 seconds.
  • Skip forward for 30 seconds.
  • View the total length of the media content.
  • View a progress indicator for audio playback.
  • View the elapsed playback time.

const { MediaObject } = require('actions-on-google');

app.onState('someState', () => {

  // `name` and `url` are placeholders; point them at your own audio content.
  const mediaObject = new MediaObject({
    name: 'Track title',
    url: 'https://example.com/audio/track.mp3',
  });

  return {
    dialogflowMediaResponse: mediaObject
  };
});

User Information

Actions on Google Documentation

You can obtain the following user information with this helper:

  • Display name
  • Given name
  • Family name
  • Coarse device location (zip code and city)
  • Precise device location (coordinates and street address)

app.onState('someState', () => {
  return {
    dialogflowPermission: {
      context: 'To read your mind',
      permissions: 'NAME',
    }
  };
});

Date and Time

Actions on Google Documentation <https://developers.google.com/actions/assistant/helpers#date_and_time>

You can obtain a date and time from users by requesting fulfillment of the actions.intent.DATETIME intent.

app.onState('someState', () => {
  return {
    dialogflowDateTime: {
      prompts: {
        initial: 'When do you want to come in?',
        date: 'Which date works best for you?',
        time: 'What time of day works best for you?',
      }
    }
  };
});

Confirmation

Actions on Google Documentation <https://developers.google.com/actions/assistant/helpers#confirmation>

You can ask the user for a generic confirmation (a yes/no question) and get the resulting answer. The grammar for "yes" and "no" naturally expands to things like "Yea" or "Nope", making it usable in many situations.

app.onState('someState', () => {
  return {
    dialogflowConfirmation: 'Can you confirm?',
  };
});

Place and Location

Actions on Google Documentation

You can obtain a location from users by requesting fulfillment of the actions.intent.PLACE intent. This helper is used to prompt the user for addresses and other locations, including any home/work/contact locations that they’ve saved with Google.

Saved locations will only return the address, not the associated mapping (e.g. “123 Main St” as opposed to “HOME = 123 Main St”).

app.onState('someState', () => {
  return {
    dialogflowPlace: {
      context: 'To find a place to pick you up',
      prompt: 'Where would you like to be picked up?',
    }
  };
});

Digital Goods

Actions on Google Documentation

You can add dialog to your Action that sells your in-app products in the Google Play store, using the digital purchases API.

You can use the google.digitalGoods object to get the subscriptions and in-app entitlements filtered by the skuIds you pass to the function. Voxa handles all the operations in the background to get access to your digital goods in the Play Store. To do that, you need to pass the GoogleAssistantPlatform object the packageName of your Android application, along with the keyFile containing the credentials you created in your Google Cloud project.
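The setup described above can be sketched roughly as follows. The transactionOptions shape (androidAppPackageName, keyFile) and the getSubscriptions method name are assumptions based on the description, not verified API; check your Voxa version's GoogleAssistantPlatform typings before relying on them.

app.onState('someState', async (voxaEvent) => {
  // Fetch the subscriptions matching the SKU ids you pass in
  // (method name assumed from the description above).
  const subscriptions = await voxaEvent.google.digitalGoods.getSubscriptions([
    'my_subscription_sku',
  ]);

  return {
    sayp: `You have ${subscriptions.length} active subscriptions`,
    flow: 'terminate',
  };
});

// `views` and `variables` are your app's usual view and variable maps.
const app = new VoxaApp({ views, variables });

const googleAction = new GoogleAssistantPlatform(app, {
  transactionOptions: {
    androidAppPackageName: 'com.example.myapp', // your Android app's package name
    keyFile: './google-cloud-credentials.json', // service account key from your Google Cloud project
  },
});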

TransactionDecision

TransactionRequirements

Routine Suggestions

Actions on Google Documentation

To consistently re-engage with users, you need to become a part of their daily habits. Google Assistant users can already use Routines to execute multiple Actions with a single command, perfect for when users wake up in the morning, head out of the house, get ready for bed, or perform many of the other tasks we do throughout the day. Now, with Routine Suggestions, after someone engages with your Action, you can prompt them to add your Action to their Routines with just a couple of taps.

app.onState('someState', () => {
  return {
    dialogflowRegisterUpdate: {
      intent: 'Show Image',
      frequency: 'ROUTINES'
    }
  };
});

Push notifications

Actions on Google Documentation

Your app can send push notifications to users whenever relevant, such as sending a reminder when the due date for a task is near.

app.onState('someState', () => {
  return {
    dialogflowUpdatePermission: {
      intent: 'tell_latest_tip'
    }
  };
});

Multi-surface conversations

Actions on Google Documentation

At any point during your app’s flow, you can check if the user has any other surfaces with a specific capability. If another surface with the requested capability is available, you can then transfer the current conversation over to that new surface.

const _ = require('lodash');

app.onIntent('someState', async (voxaEvent) => {
  const screen = 'actions.capability.SCREEN_OUTPUT';
  if (!_.includes(voxaEvent.supportedInterfaces, screen)) {
    const screenAvailable = voxaEvent.conv.available.surfaces.capabilities.has(screen);

    const context = 'Sure, I have some sample images for you.';
    const notification = 'Sample Images';
    const capabilities = ['actions.capability.SCREEN_OUTPUT'];

    if (screenAvailable) {
      return {
        sayp: 'Hello',
        to: 'entry',
        flow: 'yield',
        dialogflowNewSurface: {
          context, notification, capabilities,
        },
      };
    }

    return {
      sayp: 'Does not have a screen',
      flow: 'terminate',
    };
  }

  return {
    sayp: 'Already has a screen',
    flow: 'terminate',
  };
});

Output Contexts

Actions on Google Documentation

If you need to add output contexts to the Dialogflow webhook, you can use the dialogflowContext directive:

app.onIntent("LaunchIntent", {
  dialogflowContext: {
    lifespan: 5,
    name: "DONE_YES_NO_CONTEXT",
  },
  sayp: "Hello!",
  to: "entry",
  flow: "yield",
});

Session Entities

Google Documentation

A session represents a conversation between a Dialogflow agent and an end user. You can create special entities, called session entities, during a session. Session entities can extend or replace custom entity types, and they only exist during the session they were created for. All session data, including session entities, is stored by Dialogflow for 20 minutes.

For example, if your agent has a @fruit entity type that includes “pear” and “grape”, that entity type could be updated to include “apple” or “orange”, depending on the information your agent collects from the end-user. The updated entity type would have the “apple” or “orange” entity entry for the rest of the session.

Create an entity in your Dialogflow agent and make sure the define synonyms option is checked. Add values and synonyms as needed. Notice that the name of the entity is the value to be used by the directive (i.e., list-of-fruits).

// variables.js

  export function mySessionEntity(voxaEvent: VoxaEvent) {
    // Do something with the voxaEvent, or not...

    const sessionEntityType = [
      {
        "entities": [
          {
            "synonyms": ["apple", "green apple", "crabapple"],
            "value": "APPLE_KEY"
          },
          {
            "synonyms": ["orange"],
            "value": "ORANGE_KEY"
          }
        ],
        "entityOverrideMode": "ENTITY_OVERRIDE_MODE_OVERRIDE",
        "name": "list-of-fruits"
      }
    ];

    return sessionEntityType;
  }

// views.js

  const views = {
    "en-US": {
      translation: {
        mySessionEntity: "{mySessionEntity}"
      },
    },
  };

// state.js

  app.onState('someState', {
    dialogflowSessionEntity: "mySessionEntity",
    flow: "yield",
    sayp: "Hello!",
    to: "entry",
  });

// Or you can do it directly...

  app.onState('someState', {
    dialogflowSessionEntity:  [
      {
        "entities": [
          {
            "synonyms": ["apple", "green apple", "crabapple"],
            "value": "APPLE_KEY"
          },
          {
            "synonyms": ["orange"],
            "value": "ORANGE_KEY"
          }
        ],
        "entityOverrideMode": "ENTITY_OVERRIDE_MODE_OVERRIDE",
        "name": "list-of-fruits"
      }
    ],
    flow: "yield",
    sayp: "Hello!",
    to: "entry",
  });

Entity Directive

The Entity Directive allows you to create session entities based on a generic structure.

Entities are sent with the entities key in your controller. You can pass either a view name that resolves to the types array, or the array/object directly.

If the entityOverrideMode is omitted, the ENTITY_OVERRIDE_MODE_OVERRIDE value is used by default.

The value in googleEntityName should be the same as the one used for the entities in the Dialogflow agent.

// variables.js

   export function myEntity(voxaEvent: VoxaEvent) {
     // Do something with the voxaEvent, or not...

     const entityType = [
       {
         "entities": [
           {
             "synonyms": ["apple", "green apple", "crabapple"],
             "value": "APPLE_KEY"
           },
           {
             "synonyms": ["orange"],
             "value": "ORANGE_KEY"
           }
         ],
         "googleEntityName": "list-of-animals"
       }
     ];

     return entityType;
   }

 // views.js

   const views = {
     "en-US": {
       translation: {
         myEntity: "{myEntity}"
       },
     },
   };

 // state.js

   app.onState('someState', {
     entities: "myEntity",
     flow: "yield",
     sayp: "Hello!",
     to: "entry",
   });

// Or you can do it directly...

  app.onState('someState', {
    entities: [
      {
        "entities": [
          {
            "synonyms": ["apple", "green apple", "crabapple"],
            "value": "APPLE_KEY"
          },
          {
            "synonyms": ["orange"],
            "value": "ORANGE_KEY"
          }
        ],
        "googleEntityName": "list-of-animals"
      }
    ],
    flow: "yield",
    sayp: "Hello!",
    to: "entry",
  });