Guides

Edit Scheduled Tasks

On this page we will learn how to update a scheduled task. From the list of scheduled tasks, click the task you wish to edit:

Data schedules edit record 0c33970cdc

On this page you will be able to change the name of this task:

Data schedules create name 7604e1d302

You can also change the frequency at which it should run. If you choose hourly, you will have to provide the minute of each hour that it will run:

Data schedules frequency hourly a407dfc552

If you choose daily, you will have to provide the hour and minute of each day that it will run:

Data schedules frequency daily 980d5e53f0

You can also change the source of your data. You can choose between Amazon Web Services S3, Microsoft Azure Blob Storage, and any FTP or SFTP server.

Amazon Web Services S3

To use an AWS S3 bucket, select Amazon Web Services S3 from the Storage selector:

Data schedules storage s3 03122a1a02

Then you will need to provide the bucket name:

Data schedules storage s3 bucket 76517079f4

You will also have to provide the AWS Key:

Data schedules storage s3 key d7630aeb11

And the AWS Secret:

Data schedules storage s3 secret eadce85f88

You will also have to provide the region where your S3 bucket is located:

Data schedules storage s3 region 317df0ef02

And finally you need to provide the path to the file:

Data schedules storage s3 path e634e7f67c
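The bucket, key, secret, region, and path you enter here must point at a file you have already uploaded to S3. As an illustration only, here is a minimal sketch of staging an import file with boto3 (the AWS SDK for Python, a separate third-party dependency; every name below is a placeholder, not something this product requires):

```python
def upload_import_file_to_s3(local_path, bucket, key,
                             aws_key, aws_secret, region):
    """Upload a local CSV so the scheduled task can fetch it from S3.

    Sketch only: bucket, key, and region values are placeholders,
    and boto3 must be installed separately.
    """
    import boto3  # deferred so the sketch loads even without the SDK

    s3 = boto3.client(
        "s3",
        aws_access_key_id=aws_key,
        aws_secret_access_key=aws_secret,
        region_name=region,
    )
    s3.upload_file(local_path, bucket, key)
```

The key you upload to would be the same value you enter in the path field above.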

Microsoft Azure Blob

To use a Microsoft Azure Blob Storage container select Microsoft Azure Blob from the Storage selector:

Data schedules storage blob c615ee1cb4

Then you will need to provide the account name:

Data schedules storage blob account 1204a7c822

After that you will need to provide the container name:

Data schedules storage blob container dd1b0caa89

You will also have to provide the Azure Blob Storage access key:

Data schedules storage blob key 5e707f9fbe

And finally you need to provide the path to the file:

Data schedules storage s3 path e634e7f67c

FTP Server

If you would like to use an FTP server, please select FTP (File Transfer Protocol) from the Storage selector:

Data schedules storage ftp 44f1c8fb26

For an FTP server you must provide the host name and port number:

Data schedules storage ftp host port d2948a4f42

You must also provide the username:

Data schedules storage ftp username d8cb066c94

And the password:

Data schedules storage ftp password cdd6aaff9f

If your server supports TLS (recommended), please check the following box:

Data schedules storage ftp tls 237c88309a

And if it uses a self-signed certificate, please check the following box:

Data schedules storage ftp certificate 71161265d8

And finally you need to provide the path to the file:

Data schedules storage s3 path e634e7f67c
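To stage the file on the FTP side, any client will do. As a rough sketch using Python's standard ftplib (host, credentials, and paths are all placeholders; FTP_TLS corresponds to the TLS checkbox above):

```python
from ftplib import FTP_TLS


def upload_via_ftps(host, port, username, password,
                    local_path, remote_path):
    """Upload a local CSV over FTP with TLS (FTPS).

    Sketch only: all connection details are placeholders.
    """
    ftps = FTP_TLS()
    ftps.connect(host, port)
    ftps.login(username, password)
    ftps.prot_p()  # switch the data channel to TLS as well
    with open(local_path, "rb") as f:
        ftps.storbinary(f"STOR {remote_path}", f)
    ftps.quit()
```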

SFTP Server

If you would like to use an SFTP server, please select SFTP (SSH File Transfer Protocol) from the Storage selector:

Data schedules storage sftp 269b4fefd2

For this type of storage, you have two authentication options. If you use a password, please select Using a Password from the Authentication Type selector:

Data schedules storage sftp password bbff58c8c7

For this authentication type you must provide the host name and port number:

Data schedules storage ftp host port d2948a4f42

You must also provide the username:

Data schedules storage ftp username d8cb066c94

And the password:

Data schedules storage ftp password cdd6aaff9f

If you use a private key instead, please select Using a Private Key from the Authentication Type selector:

Data schedules storage sftp private key a8d92966f2

For this authentication type you must provide the host name and port number:

Data schedules storage ftp host port d2948a4f42

You must also provide the username:

Data schedules storage ftp username d8cb066c94

You should also provide the private key:

Data schedules storage sftp private key key d17151353e

And optionally, a passphrase (if applicable):

Data schedules storage sftp passphrase 41a4154184

And finally you need to provide the path to the file:

Data schedules storage s3 path e634e7f67c

Import Devices

To import devices, it's important that you first get your device tokens from a reliable source (like your old push provider). If you have a valid set of device tokens you wish to import, go ahead and select Import Devices from the Job Type selector:

Data import devices job type 277a1617a1

Optionally, if you wish to overwrite records that already exist, check the following box:

Data import devices overwrite e15d90f7e3

Optionally, if you want to force all records in your import file to have the same platform, OS version and app version, simply provide those in the following fields:

Data import devices columns 33449a03a2

These values can also be included in your import file. If you do that, leave the previous fields empty.

You can also associate your devices with tags while importing them. To do that, check the box below:

Data import devices include tags 62bdf648a6

If you check this option, you can also provide the separator you use between tags. By default we will look for a vertical bar (a.k.a. pipe); if you use something else, provide it in the following field:

Data import devices tag separator 1d51eefda8

Below you will find an example .csv file for this type of import:

  deviceID,osVersion,appVersion,platform,language,userID
  "b489bc0c0c4420dc6b09f54270548292a102ac9","8.0","1.1","iOS","nl","test123@example.com"
  "c0c4420dc6b09f54202ac83385f9f0a150dc6b0","8.1","1.1","iOS","en","test456@example.com"
  "9f54270548292a102ac93b07e952aeb283385f9","7.1","1.0","iOS","pt","test789@example.com"

Import Users

If you have imported devices with a unique identifier (userID), you will also want to import users. This will create a user profile, which can have an unlimited number of devices. Go ahead and select Import Users from the Job Type selector:

Data import users job type adeb8e801d

Below you will find an example .csv file for this type of import:

  userID,userName
  "test123@example.com","Test User 1"
  "test123@example.com","Test User 2"
  "test123@example.com","Test User 2"

Import Users into a Segment

With this job you can quickly assign existing users into a segment. This might be extremely useful if you want to keep user segmentation in sync with other software like a CRM. Go ahead and select Import Users into a Segment from the Job Type selector:

Data import segments job type 9d6ea0e4ee

You will also have to select the segment you want to import users into:

Data import segments segment 9c6ff0aac2

To do that, search for existing segments as shown below:

Data import segments search segment e094681a7b

And select the segment you'll use in the import job:

Data import segments selected segment f4585fd73d

Optionally if you wish to remove all users currently in the segment before importing new data, check the following box:

Data import segments remove all 5b173abae2

After importing new data, if you want to send a message to this segment, you can check the option below:

Data import segments send push 1b7c6f750c

If you check this option you will have to provide the message template we should use to send your message:

Data import segments template 6de0b37770

You do that by searching for an existing template as shown below:

Data import segments search template 909600f490

And selecting the one you want to use as the base for your message:

Data import segments selected template 7f4e5f234f

Below you will find an example .csv file for this type of import:

  userID
  "user1@email.com"
  "user2@email.com"
  "user3@email.com"

Import Regions

If your app makes use of location services like geo-fencing and you need to create a considerable number of locations, this job will help you. To import regions, simply select Import Regions from the Job Type selector:

Data import regions job type 36e990c07a

Below you will find an example .csv file for this type of import:

  name,major,latitude,longitude,distance,timezone
  "Amsterdam Shop","12345","52.357079","4.929666","1000","Europe/Amsterdam"
  "New York Shop","12346","40.7058316","-74.2581936","1000","America/New_York"
  "Tokyo Shop","12347","35.6742958","139.574921","1000","Asia/Tokyo"

Import Beacons

If, after importing regions, you also have BTLE beacons in your locations, you will want to import that data as well. To do that, select Import Beacons from the Job Type selector:

Data import beacons job type e69a523806

Because beacons are, in most cases, managed in other systems and their configuration data might change, there will be times when you just need to update existing records. If you wish to update existing beacons with new data, check the following box:

Data import beacons update 93685e359b

There are also cases where you will want to simply import beacon data and quickly create the regions they will be inserted into. If that is the case, check the box below:

Data import beacons create regions b8d4f84e46

If you check this option, you will also need to provide a timezone and radius for those regions:

Data import beacons region data 8387dd037b

Below you will find an example .csv file for this type of import:

  name,major,minor,latitude,longitude,timezone
  "Entrance","12345","1001","52.357079","4.929667","Europe/Amsterdam"
  "POS","12345","1002","52.357078","4.929668","Europe/Amsterdam"
  "Milk Aisle","12345","1003","52.357077","4.929669","Europe/Amsterdam"

Batch Passes Import

If you've subscribed to the Loyalty add-on and need to generate passes with data from other sources, this import job is your best option. To create a bulk pass generation job, select Passes Batch Import from the Job Type selector:

Data import passes job type b815236945

You will need to select an existing pass template as the starting point for this import job:

Data import passes template c94cd452c5

Search for an existing pass template as shown below:

Data import passes search template 2388fc6478

And select the template you want to use:

Data import passes selected template 41354ccc5a

Optionally you can also override the description of your template:

Data import passes template description d51462d541

Depending on the template you choose, we will show the available fields that you can provide default values to. You can leave these empty if you already provide them in the import file:

Data import passes default values 175960b77a

We will also give you the chance to distribute the passes via a push notification right after the import job is completed. For that check the following box:

Data import passes send push 9f474db353

If you check this option you can also provide the text we should send in your push notification:

Data import passes message eb26d6c719

If you wish to personalize this message for each pass, you can leave the previous field blank and instead include it in the import file.

Below you will find an example .csv file for this type of import:

  userID,pass_barcode,pass_boarding,pass_origin,pass_destination,pass_flight,pass_gate,pass_seat,pass_location_region_major,pass_relevant_date,pass_expiration_date,test_field
  joris@notifica.re,12345,930P,AMS,MSY,KL1234,D2,42A,2,2017-08-29T21:00:00,2017-08-30T00:00:00,"Nice test"
  joel@notifica.re,12346,930P,AMS,MSY,KL1234,D2,42A,2,2017-08-29T21:00:00,2017-08-30T00:00:00,"Nicer test"

Batch Private Messages Import

A powerful way of sending transactional 1-on-1 messages is to schedule an import job to send private messages. To do that, select Batch Private Messages Import from the Job Type selector:

Data import notifications job type a7bd40395b

Optionally, you can select a previously created template as the basis of your message. You can leave this option unchecked if you are going to send simple Text Alert notifications; for more complex Rich Content messages, check it:

Data import notifications send using template 1cf1e61c60

Search for an existing message template as shown below:

Data import notifications search template 1c095f35b1

And select the template you want to use:

Data import notifications template e2c94cbc57

This type of import allows you to send messages to a user (and all their devices) or to a single device.

Below you will find an example .csv file for this type of import, for users:

  userID,message,notification_placeholder1,notification_placeholder2
  joris@notifica.re,"Great stuff {{notification_placeholder1}}. You have now {{notification_placeholder2}} points", Joris, 50
  joel@notifica.re,"Great stuff {{notification_placeholder1}}. You have now {{notification_placeholder2}} points", Joel, 150

Below you will find an example .csv file for this type of import, for devices:

  deviceID,message,notification_placeholder1,notification_placeholder2
  ede4451c18dfa4bc08de6296c472558f17a3f2a38e1593a0c5cd1e8ea6f25171,"Great stuff {{notification_placeholder1}}. You have now {{notification_placeholder2}} points", Joris, 50
  ede4451c18dfa4bc08de6296c472558f17a3f2a38e1593a0c5cd1e8ea6f25171,"Great stuff {{notification_placeholder1}}. You have now {{notification_placeholder2}} points", Joel, 150
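In both variants, the {{...}} placeholders in the message column are filled per row from the matching notification_xxx columns. The substitution works roughly like this (a sketch of the idea, not the actual server implementation):

```python
import re


def render_message(template, row):
    """Replace {{column_name}} tokens with values from a CSV row (a dict).

    Unknown placeholders are left untouched.
    """
    def substitute(match):
        name = match.group(1)
        return str(row[name]) if name in row else match.group(0)

    return re.sub(r"\{\{(\w+)\}\}", substitute, template)
```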

Batch Grouped Messages Import

With this type of import job it is possible to create a single messaging campaign for several users, using data from external sources that is unknown to Notificare. To do that, select Batch Grouped Messages Import from the Job Type selector:

Data import notifications job type a7bd40395b

Then you should select a previously created template as the basis of your message. These templates can contain placeholders for any arbitrary data you choose to include in your import file. Assuming you've created a template, search for it as shown below:

Data import notifications search template 1c095f35b1

And select the template you want to use:

Data import notifications template e2c94cbc57

This type of import allows you to send messages to a user (and all their devices) or to a single device.

Recognized columns:

  • userID
  • deviceID
  • notification_xxx

Below you will find an example .csv file for this type of import, for users:

  userID,notification_placeholder1,notification_placeholder2
  joris@notifica.re, Joris, 50
  joel@notifica.re, Joel, 150

Below you will find an example .csv file for this type of import, for devices:

  deviceID,notification_placeholder1,notification_placeholder2
  ede4451c18dfa4bc08de6296c472558f17a3f2a38e1593a0c5cd1e8ea6f25171, Joris, 50
  ede4451c18dfa4bc08de6296c472558f17a3f2a38e1593a0c5cd1e8ea6f25171, Joel, 150

Once you've provided all the required data, you can quickly test whether everything is correct by hitting the Test Configuration button before actually saving the task:

Data schedules test configuration button 4d51f57f27

If everything is OK, you will see something like this:

Data schedules test ok 33acd1d982

If there's something wrong, you will see an error like the one below:

Data schedules test error cc2d02090b

Once you've made all the changes you want to your scheduled task, click the Save Task button to save it:

Data schedules save task button f5c392e66f

From this page you can also retry or delete a task. To retry a task, expand the Options menu and click Retry Task:

Data schedules options menu expanded 1719b1edaa

If you are retrying a task that previously failed while transferring the file from the external source, you can simply click Retry to force it to run again. But if the task failed while parsing, for any reason, you can force it to run as of a certain date; this usually needs to be a date before the last modified time of the file stored in the external source:

Data schedules retry modal 0a8e91e531

Finally, deleting a task will remove it completely from our system.