Episodes
-
AWS has launched Amazon Sumerian. Sumerian lets you create and run virtual reality (VR), augmented reality (AR), and 3D applications quickly and easily without requiring any specialized programming or 3D graphics expertise. In this session, we will introduce you to Sumerian and show how you can build highly immersive and interactive scenes for the enterprise that run on popular hardware such as Oculus Rift, HTC Vive, and iOS mobile devices.
-
AWS has launched Amazon Sumerian. Sumerian lets you create and run virtual reality (VR), augmented reality (AR), and 3D applications quickly and easily without requiring any specialized programming or 3D graphics expertise. In this session, we will dive deep into details about Sumerian so you can see what's under the hood. We will cover creating a project, using the visual state machine, connecting an Amazon Sumerian scene to AWS services, and using a Sumerian Host to add presence to your applications.
-
Join us to hear about our strategy for driving machine learning innovation for our customers and learn what's new from AWS in the machine learning space. Swami Sivasubramanian, VP of Amazon Machine Learning, will discuss and demonstrate the latest new services for ML on AWS: Amazon SageMaker, AWS DeepLens, Amazon Rekognition Video, Amazon Translate, Amazon Transcribe, and Amazon Comprehend. Attend this session to understand how to make the most of machine learning in the cloud.
-
Learn what it takes to migrate an on-premises database to Amazon Relational Database Service (Amazon RDS) for SQL Server, and how you can take advantage of the features and options available in the fully managed Amazon RDS platform. This session walks through best practices for system sizing and configuration, various database migration strategies, and how to leverage your existing authentication system using Amazon RDS.
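One migration path typically covered for RDS for SQL Server is native backup and restore through Amazon S3. The sketch below is illustrative only: the endpoint, credentials, database name, and S3 ARN are placeholders, and it simply calls the RDS-provided stored procedure from Python via pyodbc.

    # Illustrative sketch: restore a native SQL Server backup from S3 into an
    # RDS for SQL Server instance. Endpoint, credentials, and ARN are placeholders.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=my-sqlserver.abc123.us-east-1.rds.amazonaws.com,1433;"
        "DATABASE=master;UID=admin;PWD=example-password",
        autocommit=True,
    )
    # rds_restore_database is the stored procedure RDS exposes for native restores
    # (requires the SQLSERVER_BACKUP_RESTORE option group on the instance).
    conn.execute(
        "exec msdb.dbo.rds_restore_database "
        "@restore_db_name='MyDatabase', "
        "@s3_arn_to_restore_from='arn:aws:s3:::my-bucket/MyDatabase.bak'"
    )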
-
The need for Natural Language Processing (NLP) is growing as the amount of unstructured text data doubles every 18 months and customers look to extend their existing analytics workloads to include natural language capabilities. Historically, this data was prohibitively expensive to store, and early manual processing evolved into rule-based systems, which were expensive to operate and inflexible. In this session, we will show you how you can address this problem using Amazon Comprehend.
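As a minimal sketch of what "using Amazon Comprehend" looks like in code (the region and sample text below are placeholders, not from the session), the service exposes simple API calls for entity and sentiment detection:

    # Minimal sketch: extract entities and sentiment from unstructured text with
    # Amazon Comprehend via boto3. Region and text are placeholder values.
    import boto3

    comprehend = boto3.client("comprehend", region_name="us-east-1")
    text = "Amazon Comprehend was announced at AWS re:Invent in Las Vegas."

    entities = comprehend.detect_entities(Text=text, LanguageCode="en")
    sentiment = comprehend.detect_sentiment(Text=text, LanguageCode="en")

    for entity in entities["Entities"]:
        print(entity["Type"], entity["Text"])
    print("Sentiment:", sentiment["Sentiment"])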
-
Amazon Relational Database Service (Amazon RDS) simplifies setup, operation, and management of databases in the cloud. In this session, we will explore Amazon RDS features and best practices that offer graceful migration, high performance, elastic scaling, and high availability for Oracle databases. You will also learn from the Chief Architect for Intuit's Small Business Division how the QuickBooks Online team is using Amazon RDS for Oracle to scale the world's largest online accounting platform.
-
Building a conversational AI experience that can respond to a wide variety of inputs and situations depends on gathering high-quality, relevant training data. Dialog with humans is an important part of this training process. In this session, learn how researchers at Facebook use Amazon Mechanical Turk within the ParlAI (pronounced “parlay”) framework for training and evaluating AI models to perform data collection, human training, and human evaluation. Learn how you can use this interface to gather high-quality training data to build next-generation chatbots and conversational agents.
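For context, the data-collection tasks described here are published to workers as HITs, which ParlAI drives through the Mechanical Turk API. A hedged, standalone sketch with boto3 follows; the sandbox endpoint, reward, and question XML are illustrative placeholders rather than anything ParlAI-specific.

    # Sketch: publish a simple dialog-collection HIT on Mechanical Turk with boto3.
    # The sandbox endpoint, reward, and question form content are placeholders.
    import boto3

    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    # Illustrative QuestionForm asking a worker for one free-text dialog turn.
    question_xml = """<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
      <Question>
        <QuestionIdentifier>reply</QuestionIdentifier>
        <QuestionContent><Text>Write a friendly reply to: "How was your day?"</Text></QuestionContent>
        <AnswerSpecification><FreeTextAnswer/></AnswerSpecification>
      </Question>
    </QuestionForm>"""

    hit = mturk.create_hit(
        Title="Chat with a bot and rate its replies",
        Description="Help collect training dialog for a conversational agent.",
        Reward="0.10",
        MaxAssignments=3,
        LifetimeInSeconds=3600,
        AssignmentDurationInSeconds=600,
        Question=question_xml,
    )
    print("HIT created:", hit["HIT"]["HITId"])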
-
PostgreSQL is an open source database growing in popularity because of its rich features, vibrant community, and compatibility with commercial databases. Learn about ways to run PostgreSQL on AWS, including self-managed deployments and the managed database services from AWS: Amazon Relational Database Service (Amazon RDS) and the Amazon Aurora PostgreSQL-compatible Edition. This talk covers key Amazon RDS for PostgreSQL functionality, availability, and management. We also review general guidelines for common user operations and activities, such as migration, tuning, and monitoring, for Amazon RDS for PostgreSQL instances.
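As one small example of the monitoring piece (the instance identifier and region below are placeholders), RDS for PostgreSQL metrics can be read from Amazon CloudWatch:

    # Sketch: read recent CPU utilization for an RDS for PostgreSQL instance from
    # Amazon CloudWatch. The instance identifier and region are placeholders.
    import boto3
    from datetime import datetime, timedelta

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/RDS",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "DBInstanceIdentifier", "Value": "my-postgres-instance"}],
        StartTime=datetime.utcnow() - timedelta(hours=1),
        EndTime=datetime.utcnow(),
        Period=300,
        Statistics=["Average"],
    )
    for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
        print(point["Timestamp"], round(point["Average"], 1), "%")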
-
Ever since the term “crowdsourcing” was coined in 2006, it's been a buzzword for technology companies and social institutions. In the technology sector, crowdsourcing is instrumental for verifying machine learning algorithms, which, in turn, improves the user's experience. In this session, we explore how Pinterest adapted to an increased reliance on human evaluation to improve their product, with a focus on how they've integrated with Mechanical Turk's platform. This presentation is aimed at engineers, analysts, program managers, and product managers who are interested in how companies rely on Mechanical Turk's human evaluation platform to better understand content and improve machine learning algorithms. The discussion focuses on the analysis and product decisions related to building a high-quality crowdsourcing system that takes advantage of Mechanical Turk's powerful worker community.
-
Amazon Aurora is a cloud-optimized relational database that combines the speed and availability of high-end commercial databases with the simplicity and cost-effectiveness of open source databases. In this session, we will discuss the various options for migrating to Aurora with MySQL compatibility, the pros and cons of each, and when each method is preferred. Migrating to Aurora is just the first step. We'll share common use cases and how you can run optimally on Aurora.
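Whichever migration method is used, the Aurora target itself can be provisioned programmatically. A rough sketch with boto3 follows; the identifiers, credentials, and instance class are placeholders.

    # Sketch: provision an Aurora MySQL-compatible cluster and one instance as a
    # migration target. All identifiers and credentials are placeholders.
    import boto3

    rds = boto3.client("rds", region_name="us-east-1")

    rds.create_db_cluster(
        DBClusterIdentifier="aurora-migration-target",
        Engine="aurora-mysql",
        MasterUsername="admin",
        MasterUserPassword="example-password",
    )
    rds.create_db_instance(
        DBInstanceIdentifier="aurora-migration-target-1",
        DBClusterIdentifier="aurora-migration-target",
        DBInstanceClass="db.r5.large",
        Engine="aurora-mysql",
    )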
-
Artificial intelligence is going to be part of every software workload in the not-too-distant future. Partnering with AWS, Intel is dedicated to bringing the best full-stack solutions to help solve business and societal problems by helping turn massive datasets into information. Thorn is a non-profit organization, co-founded by Ashton Kutcher, focused on using technology innovation to combat child sexual exploitation. It is using MemSQL to provide a new approach to machine learning and real-time image recognition by making use of high-performance Intel SIMD vector dot product functionality. This session covers machine learning on Intel Xeon processor-based platforms and features speakers from Intel, Thorn, and MemSQL. Session sponsored by Intel.
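To make the dot-product idea concrete: this style of image recognition reduces each image to a feature vector and scores candidates by a vector dot product, which vectorized libraries execute with SIMD instructions. The toy NumPy illustration below uses random placeholder vectors and is not Thorn's or MemSQL's actual pipeline.

    # Toy illustration: rank stored image feature vectors against a query vector
    # by dot product. NumPy dispatches this to SIMD-accelerated BLAS routines.
    import numpy as np

    rng = np.random.default_rng(0)
    database = rng.normal(size=(10_000, 512))   # placeholder stored image features
    query = rng.normal(size=512)                # placeholder query image features

    # Normalize so the dot product becomes cosine similarity.
    database /= np.linalg.norm(database, axis=1, keepdims=True)
    query /= np.linalg.norm(query)

    scores = database @ query
    best = np.argsort(scores)[::-1][:5]
    print("Top matches:", best, scores[best])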
-
Tatsuo Ishii from SRA OSS has done extensive testing to compare the Aurora PostgreSQL-compatible Edition with standard PostgreSQL. In this session, he will present his performance testing results, and his work on Pgpool-II with Aurora; Pgpool-II is an open source tool which provides load balancing, connection pooling, and connection management for PostgreSQL.
-
In this talk, you will learn how to use or create deep learning architectures for image recognition and other neural network computations in Apache Spark. Alex, Tim, and Sujee will begin with an introduction to deep learning using BigDL. They will then explain and demonstrate how image recognition works using step-by-step diagrams and code, giving you a fundamental understanding of how you can perform image recognition tasks within Apache Spark. They will also give a quick overview of how to perform image recognition on a much larger dataset using the Inception architecture. BigDL was created specifically for Spark and takes advantage of Spark's ability to distribute data processing workloads across many nodes. As an attendee of this session, you will learn how to run the demos on your laptop, on your own cluster, or with the BigDL AMI in the AWS Marketplace. Whichever you choose, you will walk away with a much better understanding of how to run deep learning workloads using Apache Spark with BigDL. Session sponsored by Intel.
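As a rough sense of the distribution pattern involved (this is a generic PySpark sketch with a placeholder classifier, not the speakers' BigDL code), inference over many images can be parallelized across a Spark cluster like this:

    # Sketch: distribute image-recognition inference across a Spark cluster.
    # The classify() function is a stand-in; in the session's setup a BigDL model
    # would perform the forward pass on each executor instead.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("image-recognition-sketch").getOrCreate()

    def classify(path):
        # Placeholder for a real model forward pass on one image.
        return (path, "label-placeholder")

    paths = spark.sparkContext.parallelize(["img_001.jpg", "img_002.jpg"])
    predictions = paths.map(classify).collect()
    for path, label in predictions:
        print(path, label)

    spark.stop()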
-
In this introductory session, we look at how to convert and migrate your commercial databases and data warehouses to the cloud and gain your database freedom. AWS Database Migration Service (AWS DMS) and AWS Schema Conversion Tool (AWS SCT) have been used to migrate tens of thousands of databases. These include Oracle and SQL Server to Amazon Aurora, Teradata and Netezza to Amazon Redshift, MongoDB to Amazon DynamoDB, and many other data source and target combinations. Learn how to easily and securely migrate your data and procedural code, enjoy flexibility and cost savings, and gain new opportunities.
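For a feel of how an AWS DMS migration is wired together once the source and target endpoints and a replication instance exist, here is a hedged sketch of creating and starting a full-load task with boto3; all ARNs and the table mapping are placeholders.

    # Sketch: create and start an AWS DMS full-load replication task. Endpoint and
    # replication instance ARNs, plus the table mapping, are placeholder values.
    import json
    import boto3

    dms = boto3.client("dms", region_name="us-east-1")

    table_mappings = {
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-all",
            "object-locator": {"schema-name": "%", "table-name": "%"},
            "rule-action": "include",
        }]
    }

    task = dms.create_replication_task(
        ReplicationTaskIdentifier="oracle-to-aurora-full-load",
        SourceEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:SOURCE",
        TargetEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:TARGET",
        ReplicationInstanceArn="arn:aws:dms:us-east-1:123456789012:rep:INSTANCE",
        MigrationType="full-load",
        TableMappings=json.dumps(table_mappings),
    )
    task_arn = task["ReplicationTask"]["ReplicationTaskArn"]

    # Wait until the task has finished creating before starting it.
    dms.get_waiter("replication_task_ready").wait(
        Filters=[{"Name": "replication-task-arn", "Values": [task_arn]}]
    )
    dms.start_replication_task(
        ReplicationTaskArn=task_arn,
        StartReplicationTaskType="start-replication",
    )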
-
Amazon SageMaker is a fully managed service that enables data scientists and developers to quickly and easily build, train, and deploy machine learning models at scale. This session will introduce you to the features of Amazon SageMaker, including a one-click training environment, highly optimized machine learning algorithms with built-in model tuning, and deployment without engineering effort. With zero setup required, Amazon SageMaker significantly decreases your training time and overall cost of building production machine learning systems. You'll also hear how and why Intuit is using Amazon SageMaker on AWS for real-time fraud detection.
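A hedged sketch of the train-then-deploy flow using the SageMaker Python SDK with the built-in XGBoost algorithm is shown below; the role ARN, S3 paths, and instance types are placeholders, and the parameter names follow SDK v2 conventions.

    # Sketch: train a built-in XGBoost model and deploy it to an endpoint with the
    # SageMaker Python SDK. Role, bucket paths, and instance types are placeholders.
    import sagemaker
    from sagemaker.estimator import Estimator
    from sagemaker.inputs import TrainingInput

    session = sagemaker.Session()
    role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"

    image_uri = sagemaker.image_uris.retrieve(
        "xgboost", session.boto_region_name, version="1.5-1"
    )

    estimator = Estimator(
        image_uri=image_uri,
        role=role,
        instance_count=1,
        instance_type="ml.m5.xlarge",
        output_path="s3://my-bucket/xgboost/output",
        sagemaker_session=session,
    )
    estimator.set_hyperparameters(objective="binary:logistic", num_round=100)
    estimator.fit({
        "train": TrainingInput("s3://my-bucket/xgboost/train", content_type="text/csv")
    })

    # One call stands up a managed real-time inference endpoint.
    predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")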
-
Amazon Neptune is a fully managed graph database service that has been built from the ground up to handle rich, highly connected data. Graph databases have diverse use cases across multiple industries; examples include recommendation engines, knowledge graphs, fraud detection, social networks, network management, and life sciences. Amazon Neptune is open and flexible, with support for Apache TinkerPop and RDF/SPARQL standards. Under the hood, Neptune uses the same foundational building blocks as Amazon Aurora, which gives it high performance, availability, and durability. In this session, we will do a deep dive into the capabilities, performance, and key innovations in Amazon Neptune.
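To give a concrete sense of the property-graph side, Gremlin traversals run against Neptune's WebSocket endpoint. A hedged sketch with the gremlinpython driver follows; the cluster endpoint is a placeholder and the example assumes IAM database authentication is not enabled.

    # Sketch: connect to a Neptune cluster's Gremlin endpoint with gremlinpython
    # and run a simple traversal. The endpoint hostname is a placeholder.
    from gremlin_python.process.anonymous_traversal import traversal
    from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection

    connection = DriverRemoteConnection(
        "wss://my-neptune-cluster.cluster-abc123.us-east-1.neptune.amazonaws.com:8182/gremlin",
        "g",
    )
    g = traversal().withRemote(connection)

    # Add a vertex, then count vertices with that label.
    g.addV("person").property("name", "Alice").next()
    print(g.V().hasLabel("person").count().next())

    connection.close()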
-
In this session, you'll learn how to leverage AWS Service Catalog, AWS Lambda, AWS Config and AWS CloudFormation to create a robust, agile environment while maintaining enterprise standards, controls and workflows. Fannie Mae demonstrates how they are leveraging this solution to integrate with their existing workflows and CMDB/ITSM systems to create an end-to-end automated and agile IT lifecycle and workflow.
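To illustrate the self-service piece, approved products are launched through the Service Catalog API, and the CloudFormation stack behind each product is provisioned on the user's behalf. A hedged sketch with boto3 follows; the product ID, provisioning artifact ID, and parameters are placeholders.

    # Sketch: launch an approved AWS Service Catalog product, which provisions the
    # underlying CloudFormation stack. IDs and parameter values are placeholders.
    import uuid
    import boto3

    servicecatalog = boto3.client("servicecatalog", region_name="us-east-1")

    response = servicecatalog.provision_product(
        ProductId="prod-abc123example",
        ProvisioningArtifactId="pa-def456example",
        ProvisionedProductName="dev-team-web-stack",
        ProvisioningParameters=[
            {"Key": "InstanceType", "Value": "t3.micro"},
            {"Key": "Environment", "Value": "dev"},
        ],
        ProvisionToken=str(uuid.uuid4()),  # idempotency token
    )
    print(response["RecordDetail"]["Status"])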
-
In this session, we will provide an overview of Amazon Neptune, AWS's newest database service. Amazon Neptune is a fast, reliable graph database that makes it easy to build applications over highly connected data. We will then explore how Siemens is building a knowledge graph using Amazon Neptune.
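Because knowledge graphs are commonly modeled as RDF, here is a hedged sketch of querying Neptune's SPARQL endpoint over HTTPS with Python; the cluster endpoint and query are placeholders, and the example assumes IAM database authentication is not enabled.

    # Sketch: run a SPARQL query against a Neptune cluster's HTTPS endpoint.
    # The endpoint hostname and query are placeholder values.
    import requests

    endpoint = "https://my-neptune-cluster.cluster-abc123.us-east-1.neptune.amazonaws.com:8182/sparql"
    query = """
    SELECT ?subject ?predicate ?object
    WHERE { ?subject ?predicate ?object }
    LIMIT 10
    """

    response = requests.post(endpoint, data={"query": query})
    response.raise_for_status()
    for binding in response.json()["results"]["bindings"]:
        print(binding)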
-
Organizations use application delivery controllers (ADCs) to ensure that their most important applications receive the best performance across their network. In this session, you learn how and why Salesforce used the F5 BIG-IP platform, an ADC solution from AWS Marketplace, during a migration to AWS. To preserve an existing skillset within their business, Salesforce chose AWS Marketplace to first evaluate the solution on the AWS platform before ultimately selecting it as part of their international rollout. You see how BIG-IP performs application routing and security, and how it works with existing AWS networking solutions to provide a consistent experience for domestic and international rollouts. You also learn how Salesforce successfully used the AWS Marketplace Private Offers program to procure an enterprise license and consolidate the expenditure onto their AWS bill.
-
In this session, we introduce you to best practices for migrating databases, such as traditional RDBMS and NoSQL databases, to Amazon DynamoDB. We discuss DynamoDB key concepts, evaluation criteria, data modeling in DynamoDB, how to move data into DynamoDB, and key data migration considerations. We share a case study of Samsung Electronics, which migrated its Cassandra cluster to DynamoDB for its Samsung Cloud workload.
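As a small sketch of the "moving data" step (the table name and items below are placeholders, and the table is assumed to already exist), rows exported from a source database can be bulk-loaded with the DynamoDB batch writer:

    # Sketch: bulk-load rows exported from a source database into DynamoDB using
    # the batch writer. Table name and items are placeholder values.
    import boto3

    dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
    table = dynamodb.Table("UserProfiles")  # assumes the table already exists

    exported_rows = [
        {"user_id": "u-1001", "device": "phone", "settings": {"theme": "dark"}},
        {"user_id": "u-1002", "device": "tablet", "settings": {"theme": "light"}},
    ]

    # batch_writer buffers writes and retries unprocessed items automatically.
    with table.batch_writer() as batch:
        for row in exported_rows:
            batch.put_item(Item=row)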