Call AI Public API

This article explains how to use the Call AI public Application Programming Interface (API).

The Call AI public API is an application programming interface made publicly available to software developers through GraphQL. It provides programmatic access to the Call AI platform.

Prerequisites

  • Open Authorization (OAuth) access token to authenticate and authorize the request to Call AI public API. For more information, see Generate OAuth Access Token.
  • URL for Call AI public API.

What is GraphQL?

GraphQL is an open-source data query and manipulation language for APIs and a runtime environment for fulfilling queries with existing data. GraphQL provides a comprehensive description of the data in your API. It allows you to define the required data structure, and the server returns data in that same structure.

You can use GraphQL by providing a query and a description of the data you need. For example, the recordings query returns only the fields that you request.
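As a minimal sketch in Python, a GraphQL request is just a JSON payload containing the query text and its variables. The selection set below uses field and parameter names documented later in this article; the exact schema and type names should be verified in the playground documentation section.

```python
import json

# A sketch of a recordingsV2 query using field names documented in this
# article. The selection set and the RecordingsV2Filters type name are
# assumptions; verify them in the playground documentation section.
RECORDINGS_QUERY = """
query Recordings($first: Int!, $after: String!, $filters: RecordingsV2Filters!) {
  recordingsV2(first: $first, after: $after, filters: $filters) {
    edges {
      node {
        id
        title
        date
        duration
      }
    }
  }
}
"""

def build_payload(first, after, filters):
    """Build the JSON body a GraphQL server expects: the query text plus
    a variables object matching the query's declared parameters."""
    return json.dumps({
        "query": RECORDINGS_QUERY,
        "variables": {"first": first, "after": after, "filters": filters},
    })
```

The server responds with a JSON document whose `data` key mirrors the shape of the selection set, so you get back exactly the structure you asked for.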

What is a GraphQL Playground?

A GraphQL playground is a graphical, interactive, and in-browser Integrated Development Environment (IDE). You can write queries and mutations directly on the interface. You can also provide input parameters and obtain data based on them.

Steps to use GraphQL

  1. Enter the Call AI public API URL in your browser’s address bar.
  2. The interface that appears has the following four sections:

    1. Query editor: In this section, you can write your query or mutation and specify the data to obtain from the query.
    2. Output section: In this section, you can view the query results. You can also search and copy data from this section.
    3. Documentation section: In this section, you can check and drill down into the documentation of queries and mutations. You can also:
      • Check the input parameters and the data obtained from the parameters.
      • Identify which parameters are mandatory and their data type.
      • Identify the return type and included fields of a query or mutation.
    4. Header section: In this section, you can add the headers and the OAuth access token. The header is sent with every request to the GraphQL server to authorize the request. For more information, see Generate OAuth Access Token.
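The header handling above can be sketched in Python with the standard library. The endpoint URL, token value, and the Bearer scheme are placeholders and assumptions; substitute your Call AI public API URL and the OAuth access token from Generate OAuth Access Token.

```python
import json
import urllib.request

# Hypothetical endpoint and token placeholders; replace with your Call AI
# public API URL and OAuth access token.
API_URL = "https://example.com/graphql"
ACCESS_TOKEN = "<your-oauth-access-token>"

def graphql_request(query, variables=None):
    """Build an authorized POST request carrying the GraphQL payload.

    The Authorization header is sent with every request so the GraphQL
    server can authorize it. The Bearer scheme is an assumption; use the
    token format your deployment expects.
    """
    body = json.dumps({"query": query, "variables": variables or {}}).encode()
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {ACCESS_TOKEN}",
        },
        method="POST",
    )

req = graphql_request("{ __typename }")
# Send with urllib.request.urlopen(req); the response body is a JSON
# document with a top-level "data" key.
```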


Supported Queries

The following sections describe the supported query, its parameters, and the fields it returns in the Call AI public API:

  • recordingsV2: Fetches details of all the recordings.

recordingsV2 accepts the following parameters (* represents a mandatory parameter):

  • first*: Indicates the number of recordings to fetch.
  • after*: Indicates the cursor position from which fetching starts (used for paginated results).
  • filters*: Indicates the filters applied while fetching recordings. It accepts a RecordingsV2Filters object with the following properties:
    • categoryId*: A string denoting the category of recordings to fetch. Possible values are:
      1. All recordings on the Call AI platform that you can access (including the recordings of your team members)
      2. Recordings where you were a participant
      3. Recordings shared with you
      4. Recordings bookmarked by you
      5. Recordings where you and your team members participated
      6. Recordings shared by you
    • date: A field to fetch recordings that took place during the given date range. Pass an array with two values denoting the start and end time.
    • duration: A field to fetch recordings that are longer than the given value. Pass an array with an integer value denoting the duration in milliseconds.

For more information about RecordingsV2Filters objects, see the playground documentation section on the GraphQL interface.
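The first and after parameters together implement cursor-based pagination. The loop below is a sketch with a stub standing in for the real API call; the end-cursor shape is an assumption based on common GraphQL connection patterns, not the documented schema.

```python
def fetch_page(first, after):
    """Stub standing in for a real recordingsV2 call; the first argument
    is ignored here. Returns a list of nodes plus the cursor for the next
    page, or None when no further page exists (an assumed convention)."""
    pages = {"0": ([{"id": 1}, {"id": 2}], "2"),
             "2": ([{"id": 3}], None)}
    return pages[after]

def fetch_all_recordings(first=2):
    """Follow the after cursor until no further page is available."""
    nodes, cursor = [], "0"
    while cursor is not None:
        page, cursor = fetch_page(first, cursor)
        nodes.extend(page)
    return nodes
```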

  • sort: A field to specify the sorting order of the requested data. Pass an array where each element has the following properties:
    • sortOrder: Possible values are asc or desc.
    • sortType: Possible values are:
      • score
      • start_date
      • call_score
      • last_shared

For more information about the input parameters, see the playground documentation section on the GraphQL interface.
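Putting the parameters together, a variables object for recordingsV2 might look like the sketch below. The categoryId value and the date format (ISO 8601 timestamps) are assumptions; check the playground documentation for the real identifiers and formats.

```python
from datetime import datetime, timezone

# Hypothetical date range for the filter.
start = datetime(2021, 11, 1, tzinfo=timezone.utc)
end = datetime(2021, 11, 30, tzinfo=timezone.utc)

variables = {
    "first": 20,
    "after": "0",  # cursor position from which fetching starts
    "filters": {
        "categoryId": "<category-id>",  # placeholder; see playground docs
        # Two values denoting the start and end of the date range; the
        # timestamp format here is an assumption.
        "date": [start.isoformat(), end.isoformat()],
        # Recordings longer than five minutes, in milliseconds.
        "duration": [5 * 60 * 1000],
    },
    "sort": [{"sortOrder": "desc", "sortType": "start_date"}],
}
```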

The following list describes the fields of the node (a nested field of edges) returned by the recordingsV2 query:

  • id: A unique numerical identifier of the meeting.
  • title: Title of the meeting as present in the calendar event.
  • description: Description of the meeting as present in the calendar event.
  • date: Scheduled date and timestamp of the meeting, for example, 2021-11-02T09:00:00.000Z.
  • actualStartedAt: Time when the participants joined the meeting. It could be a few minutes before or after the scheduled meeting time.
  • actualEndedAt: Time when all the participants left the meeting.
  • duration: Duration of the meeting in seconds, for example, 1385.72.
  • sharedBy: Name of the user who shared the meeting.
  • sharedAt: Timestamp when a user shared the meeting.
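Reading these node fields out of a response can be sketched as follows. The data → recordingsV2 → edges → node nesting follows the usual GraphQL connection pattern; the sample values below are illustrative only.

```python
# Illustrative response in the shape returned by recordingsV2.
response = {
    "data": {
        "recordingsV2": {
            "edges": [
                {"node": {"id": 101, "title": "Weekly sync",
                          "date": "2021-11-02T09:00:00.000Z",
                          "duration": 1385.72}},
            ]
        }
    }
}

nodes = [edge["node"] for edge in response["data"]["recordingsV2"]["edges"]]
titles = [node["title"] for node in nodes]
# duration is in seconds, so divide by 60 for minutes.
total_minutes = sum(node["duration"] for node in nodes) / 60
```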

sharedWithInternal

Names of the internal users (within the organization) with whom you have shared the meeting. It is an array of InternalShareObject where each element represents an internal share and has the following subfields:

  • user: The User with whom you have shared the meeting.
  • snippet: The Snippet information if you shared a call snippet. Otherwise, it is null.

sharedWithExternal

Names of the external users (outside the organization) with whom you have shared the meeting. It is an array of ExternalShareUser where each element represents an external share and has the following subfields:

  • id: A unique share identifier.
  • email: The email address of the user with whom you have shared the meeting.
  • externalShareObject: The ExternalShareObject containing details such as:
    • sharedBy
    • allowDownload
    • accessRemoved
    • externalLink

It has the following parameters:

  • showAccessRemoved: Boolean (Default value: false)

speakerEventsV2

Speaker events trigger whenever a participant speaks; hence, multiple events can trigger for the same participant throughout the meeting. The field is an array of SpeakerEventV2 containing one element per speaker (participant) who caused at least one event during the meeting.

Each element has the following subfields:

  • speaker: The speaker ParticipantV2 who triggered the event.
  • events: Details of the events caused by the speaker. It is an array of SpeakerEvents where each object has startTime (in seconds), endTime (in seconds), and startChunk.
  • time: The sum of the durations of all events triggered by the speaker.
  • percentage: Percentage of time for which a speaker talked during the meeting.
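The time and percentage subfields can be recomputed from the raw events, which is a useful check on how they are defined above. The sample events are illustrative; startTime and endTime are in seconds.

```python
# Illustrative speakerEventsV2 data: one element per speaker.
speaker_events = [
    {"speaker": {"name": "Ana"},
     "events": [{"startTime": 0.0, "endTime": 30.0},
                {"startTime": 60.0, "endTime": 90.0}]},
    {"speaker": {"name": "Ben"},
     "events": [{"startTime": 30.0, "endTime": 60.0}]},
]

meeting_duration = 90.0  # total meeting length in seconds (illustrative)

def talk_time(entry):
    """Sum of the durations of all events triggered by one speaker."""
    return sum(e["endTime"] - e["startTime"] for e in entry["events"])

# Percentage of meeting time for which each speaker talked.
stats = {e["speaker"]["name"]:
         round(talk_time(e) / meeting_duration * 100, 1)
         for e in speaker_events}
```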

participantsV2

Details of all the participants who joined the meeting. It is an array of ParticipantV2 where each element represents a participant of the meeting and has subfields such as:

  • id
  • name
  • email
  • type

It has the following boolean parameters:

  • excludeBots: Specify whether to exclude the Call AI bot from the query response.
  • excludeUnknowns: Specify whether to exclude the unknown participants (whom Call AI cannot identify as internal or external) from the query response.

transcription

Transcript of the meeting. It has the following fields:

  • meetingId
  • chunks: Details of all the transcription chunks of the meeting. It is an array of TranscriptChunk where each element has subfields such as:
    • text
    • startTime
    • endTime
    • speakerEmails
  • languageDirection: Indicates transcript direction. Possible values are ltr (Left To Right) and rtl (Right To Left).
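The chunk subfields listed above are enough to assemble a readable transcript. The sketch below uses illustrative data; real chunks carry the speaker email addresses identified by Call AI.

```python
# Illustrative transcription chunks in the shape described above.
chunks = [
    {"text": "Hi everyone.", "startTime": 0.5, "endTime": 1.8,
     "speakerEmails": ["ana@example.com"]},
    {"text": "Let's get started.", "startTime": 2.0, "endTime": 3.5,
     "speakerEmails": ["ana@example.com"]},
]

# One line per chunk: "speaker(s): text".
lines = [f'{", ".join(c["speakerEmails"])}: {c["text"]}' for c in chunks]
transcript = "\n".join(lines)
```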

themesV2

Details of all the themes spoken and heard during a meeting. Its value is of type MeetingThemeV2 and has the following subfields:

  • meetingId
  • spoken: Details of themes spoken. It is an array of MeetingThemeDetails where each element shows details of the themes spoken by internal participants (reps).
  • heard: Details of themes heard. It is an array of MeetingThemeDetails where each element shows details of the themes spoken by external participants (customers or prospects).
  • filler: Details of the filler words spoken by internal participants. It is an array of MeetingThemeDetails with a single element.

MeetingThemeDetails has the following subfields:

  • id: A unique theme identifier.
  • themeDetails: An object of type Theme having details such as:
    • id
    • name
    • description
    • createdAt
    • createdBy
    • updatedBy
    • numRecordings
    • numKeywords
    • keywords
    • isKeyTheme
  • keywordOccurrence: An array of KeywordOccurrenceV2 with subfields id, keywordDetails, and keywordCount.
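As a sketch of working with these subfields, the keyword occurrences of each theme can be totalled as below. The inner shape of keywordDetails is an assumption; the sample data is illustrative.

```python
# Illustrative MeetingThemeDetails elements (e.g. from the spoken array).
spoken = [
    {"id": "t1",
     "themeDetails": {"name": "Pricing"},
     "keywordOccurrence": [
         # The "keyword" key inside keywordDetails is an assumed shape.
         {"id": "k1", "keywordDetails": {"keyword": "discount"}, "keywordCount": 3},
         {"id": "k2", "keywordDetails": {"keyword": "quote"}, "keywordCount": 2},
     ]},
]

# Total keyword occurrences per theme name.
counts = {t["themeDetails"]["name"]:
          sum(k["keywordCount"] for k in t["keywordOccurrence"])
          for t in spoken}
```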

aggregatedThemes

It is similar to the themesV2 field and provides an aggregated view of all the themes that were spoken and heard. It is an array of AggregatedThemes where each element has the following subfields:

  • type: THEME_TYPE denoting whether the theme was spoken or heard.
    Possible values: SPOKEN, HEARD, and ALL.
    When the value is ALL, the sibling themes field contains all themes that were spoken and heard.
  • themes: Details of the themes. It is an array of AggregatedTheme where each element has details such as:
    • theme
    • startChunk
    • endChunk
    • startTime
    • endTime

    Here, theme is a nested object containing information such as id, name, and keywords.

It has the following parameters:

  • types: Input to specify the themes requested. Pass an array of THEME_TYPE values, each representing a requested theme type.

Possible values of THEME_TYPE:

  • SPOKEN
  • HEARD
  • ALL

For more information about Call AI, see Mindtickle help site documentation.

For more information about GraphQL, see GraphQL documentation.
