AppFlow, a new service that facilitates data transfer between AWS and SaaS applications

Amazon recently unveiled AppFlow, a new integration service that facilitates data transfer between AWS and SaaS applications (such as Google Analytics, Marketo, Salesforce, ServiceNow, Slack, Snowflake, and Zendesk).

What is interesting about AppFlow is how easy it makes transferring data between AWS services and SaaS applications. AppFlow makes it possible to create these integrations without writing custom code, a process that can otherwise take months to complete due to data cleansing requirements and other complexities.

Amazon AppFlow also works with AWS PrivateLink to route data flows through the AWS network rather than the public internet, providing even stronger data security and privacy.

About AppFlow

An AppFlow integration is set up in a short sequence of steps, without coding or special connectors. Automated flows can run at scale and at the chosen frequency: they can be scheduled, triggered by a business event, or launched on demand. In a CRM scenario, for example, a one-time data transfer can be run when a lead converts, when a customer record is opened, or when an agreement is signed.

Millions of customers run applications, data lakes, large-scale analytics, machine learning, and IoT workloads on AWS. These customers also often have data stored in dozens of SaaS applications, resulting in silos that are disconnected from the data stored on AWS. Organizations want to be able to combine their data from all of these sources, but that requires customers to spend days writing code to build custom connectors and data transformations to convert disparate data types and formats across different SaaS applications.

AppFlow is for companies and organizations that want to store and process data from multiple SaaS applications on AWS for analysis, to create machine learning models, or to collect data from IoT applications.

This managed service automates two-way data exchanges between SaaS software and AWS services such as S3 storage, the Redshift and Aurora databases, and SageMaker for building machine learning models, as well as third-party services such as the Snowflake data warehouse. Integration is straightforward with CRM solutions like Salesforce, ITSM tools like ServiceNow, support solutions like Zendesk, collaboration tools like Slack instant messaging, or marketing platforms like Marketo.
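As a rough illustration of how such an exchange is defined, the sketch below shows the shape of a flow definition one might pass to the AppFlow API (for example via boto3's `appflow` client and its `create_flow` call): a Salesforce source, an S3 destination, and a task that copies fields over. The flow name, object, and bucket name are hypothetical placeholders, not values from the announcement.

```python
# Sketch of an AppFlow flow definition (Salesforce -> S3), assuming the
# field names of the AppFlow CreateFlow API; names/values are illustrative.
flow_request = {
    "flowName": "salesforce-accounts-to-s3",           # hypothetical flow name
    "triggerConfig": {"triggerType": "OnDemand"},      # run when requested
    "sourceFlowConfig": {
        "connectorType": "Salesforce",
        "sourceConnectorProperties": {
            "Salesforce": {"object": "Account"},       # pull the Account object
        },
    },
    "destinationFlowConfigList": [{
        "connectorType": "S3",
        "destinationConnectorProperties": {
            "S3": {"bucketName": "example-data-lake"},  # hypothetical bucket
        },
    }],
    "tasks": [{
        "taskType": "Map_all",    # copy every source field to the destination
        "sourceFields": [],
        "taskProperties": {},
    }],
}

# With boto3, this dict would be passed along as:
#   boto3.client("appflow").create_flow(**flow_request)
```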

Companies with few developers on staff may resort to manual data entry and exports, which increases the risk of human error in downstream data and machine learning models and opens up the potential for data leaks. Amazon AppFlow addresses these problems and makes the service usable by customers with varying levels of technical skill.

With just a few clicks in the Amazon AppFlow console, customers can configure multiple types of triggers for their data flows, including one-time on-demand transfers, data syncs scheduled at predetermined times, or event-driven transfers, such as when a campaign starts.
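The three trigger styles described above can also be expressed programmatically. The sketch below shows them as `triggerConfig` payloads, assuming the field names of the AppFlow API; the schedule expression is illustrative, so the exact syntax should be checked against the AppFlow documentation.

```python
# Sketch of the three AppFlow trigger styles as API payloads (illustrative).

# 1. One-time, launched on demand.
on_demand = {"triggerType": "OnDemand"}

# 2. Scheduled sync at a predetermined frequency.
scheduled = {
    "triggerType": "Scheduled",
    "triggerProperties": {
        "Scheduled": {
            # Illustrative rate expression, e.g. hourly; exact syntax
            # is defined by the AppFlow docs.
            "scheduleExpression": "rate(1hours)",
        },
    },
}

# 3. Event-driven: the flow runs when the source connector emits an event
# (e.g. a campaign starting), where the connector supports it.
event_driven = {"triggerType": "Event"}
```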

AppFlow can also transform and process data in flight: combining fields (to calculate new values), filtering records (to reduce noise), masking sensitive data (to ensure privacy), and validating field values (to clean data).

Amazon AppFlow automatically encrypts data at rest and in transit using AWS-managed or customer-managed encryption keys, and lets users keep data flows off the public Internet for applications integrated with AWS PrivateLink, reducing exposure to security threats.

Customers can use Amazon AppFlow's simple interface to create and run data flows between sources in minutes, while Amazon AppFlow securely orchestrates and executes the data transfer.

For those interested in the service: there are no upfront charges or fees for using Amazon AppFlow; customers are billed only for the number of flows they run and the volume of data processed. In addition, AWS has published a post explaining how to use AppFlow to transfer Slack conversations to S3 for analysis with Athena and visualization with QuickSight.

