AWS news

World latest news

Amazon Successfully Integrates AWS Cloud Support for its Managed Blockchain Platform

As per a statement issued by retail giant Amazon earlier today, AWS (Amazon Web Services) has been successful in integrating a cloud storage module with its existing blockchain platform. For those of our readers who may not be aware, the multinational will be making use of CloudFormation to support its blockchain management system. It is […]
Bitcoin Exchange Guide

How to Set Up Live and On-Demand Streaming in AWS Media Services

AWS Media Services are a set of cloud-based solutions created to assist with video processing. They process, store, and deliver video content using OTT (Over-the-Top) technology. These services offer a pay-as-you-go pricing model, which makes them great for building scalable solutions within a short timeframe.

A bit of theory

Below are the solutions that make up the AWS Media Services family, with a nutshell explanation of what each one does:

- AWS Elemental MediaConnect. A service optimized for safe and reliable video stream transfer.
- AWS Elemental MediaConvert. Responsible for processing video files and creating streaming Video on Demand content.
- AWS Elemental MediaLive. Converts video content for broadcasts in real time.
- AWS Elemental MediaPackage. Prepares and protects video content for delivery over the internet. From a single source, MediaPackage creates video streams in different formats and standards (MPEG-DASH, Smooth Streaming, HLS, CMAF) for playback on a variety of devices (TVs, PCs, smartphones, tablets, game consoles).
- AWS Elemental MediaStore. AWS storage optimized for real-time video content. It provides low read and write latency under a large number of requests, so you can give users a high-quality service.
- AWS Elemental MediaTailor. Allows video providers to embed ads into their video streams on the server side.

Benefits of AWS Media Services

So why are AWS Media Services worth using? First of all, they reduce infrastructure expenses and optimize the content delivery process for broadcast organizers and media content providers. For content owners, they make it possible to create on-demand video content for broadcasts of any scale.
Media companies, meanwhile, can broaden their audience thanks to OTT technology.

The main use cases

- Creation of video broadcasts in real time (Live Streaming).
- Creation of a Video on Demand content library.

Setting up real-time video streaming

Let's get to practice now. Real-time video streaming can be built by means of the following services:

- AWS Elemental MediaLive. Converts/encodes video and audio streams with the help of various codecs; it is also in charge of adding subtitles and watermarks.
- AWS Elemental MediaStore. Optimized media content storage that lets you keep recordings of video streams for further on-demand delivery (optional). It can sometimes function incorrectly, so Amazon recommends using AWS S3.
- AWS Elemental MediaPackage. Transfers the transcoded video stream to target devices using different technologies (Apple HLS, MPEG-DASH, CMAF, Microsoft Smooth Streaming).
- AWS CloudFront. The CDN service that speeds up delivery across different geographical regions (optional).

Here is a short version of what you need to do to set up a live video broadcast:

1. Create a channel in AWS MediaPackage.
2. Create an endpoint for the channel.
3. Create a source (input) for the AWS MediaLive channel (and an input security group if needed).
4. Create an AWS MediaLive channel to convert video from the source and transfer it to AWS MediaPackage.
5. Start video broadcasting.

Let's consider these points in more detail.

1. Establish a channel in AWS Elemental MediaPackage

The first step is to create an AWS MediaPackage channel for processing the transcoded video stream. You can do so using the AWS SDK or the developer console. (The following examples are based on the developer console, but it's easy to do with the SDK as well.) To create a channel, specify its ID and, optionally, a description. As of now, AWS Elemental MediaPackage works with the HLS protocol only, so keep that in mind.

2. Configure endpoints

The next step is to configure endpoints: the addresses that end users follow to get the streaming video. Here are all the supported options:

- Apple HLS
- Microsoft Smooth
- Common Media Application Format (CMAF)

3. Tune MediaLive

Once MediaPackage is all set, you can proceed with tuning AWS MediaLive. First, you need to create the Input (the source of the video stream). MediaLive can receive as input:

- An MP4 file.
- A stream transferred via the RTMP/RTP/HLS protocols.
- A stream transferred using AWS MediaConnect.

As an example, let's create an input that receives a screen recording from OBS (Open Broadcaster Software) over the RTMP protocol.

4. Create a video stream processing channel in MediaLive

For this, you need to specify the channel name and create an IAM role (or pick an existing one) to provide the rights needed for integration with other services. Then choose the previously created input along with an output group (the destination for the converted media stream). As output, AWS MediaLive supports:

- Apple HLS
- Archiving the video stream to S3
- RTMP (transfer of the video to streaming services like YouTube, Twitch, and others)
- Microsoft Smooth

It's possible to create your own output group or pick one from the templates. Let's take the 'Live event' template as an example for creating an output HLS stream with support for four resolutions (240p, 480p, 720p, and 1080p). In the settings of the output stream, specify the destination address. (For example, with HLS output it can be a service such as MediaPackage, MediaStore, S3, or the Akamai CDN.) One more thing to specify is the input address of the MediaPackage channel. After that, you'll be able to change the video's encoding settings or add outputs for other screen resolutions and bitrates. Finally, click the 'Create channel' button to create the channel itself.

5. Launch the MediaLive channel

To finally start broadcasting, point your encoder at the RTMP address you got when creating the input. The broadcast will be viewable at the addresses created for the MediaPackage endpoints. If you need to save the broadcast, use MediaStore or S3 instead of MediaPackage.

Setting up Video on Demand

AWS Elemental MediaConvert is used to deliver on-demand video content. It works similarly to MediaLive, with the one difference that MediaLive processes video streams in real time, while MediaConvert does the same with video files taken from S3 storage. Amazon gives developers an open-source solution that takes several steps to set up:

1. Load the source file to S3, with a tag for archiving in AWS S3 Glacier.
2. The source video is validated and an AWS MediaConvert template is chosen based on the metadata. The received data are saved to a database.
3. A video processing task (Job) is created in AWS MediaConvert based on the profile.
4. At the end of the video conversion, the result is transferred to S3 and the database record is updated.
5. The processed data from S3 are cached in CloudFront to speed up delivery across various geographical regions.
6. Errors are caught with the help of CloudWatch.

How to Set Up Live and On-Demand Streaming in AWS Media Services was originally published on Medium, where people are continuing the conversation by highlighting and responding to this story.
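The live-streaming steps above can also be driven from code. Here is a minimal boto3 sketch of steps 1-3 (MediaPackage channel, HLS endpoint, RTMP input). This is an illustrative sketch rather than the article's exact setup: the channel ID "live-demo", the input name "obs-input", and the CIDR whitelist are made-up placeholders, and the full MediaLive channel configuration (step 4) is omitted because it involves many more settings.

```python
def mediapackage_channel_params(channel_id):
    # Step 1: a MediaPackage channel only needs an ID; the description is optional.
    return {"Id": channel_id, "Description": "Live demo channel"}


def hls_endpoint_params(channel_id):
    # Step 2: an HLS origin endpoint attached to the channel.
    return {
        "ChannelId": channel_id,
        "Id": channel_id + "-hls",
        "HlsPackage": {"SegmentDurationSeconds": 6, "PlaylistWindowSeconds": 60},
    }


def rtmp_input_params(name, security_group_id):
    # Step 3: an RTMP_PUSH input that OBS can stream to.
    return {
        "Name": name,
        "Type": "RTMP_PUSH",
        "InputSecurityGroups": [security_group_id],
        "Destinations": [{"StreamName": name + "/stream"}],
    }


def main():
    # boto3 is imported here so the parameter helpers above can be used
    # without the AWS SDK installed; the calls require AWS credentials.
    import boto3

    mp = boto3.client("mediapackage")
    ml = boto3.client("medialive")
    mp.create_channel(**mediapackage_channel_params("live-demo"))
    endpoint = mp.create_origin_endpoint(**hls_endpoint_params("live-demo"))
    sg = ml.create_input_security_group(WhitelistRules=[{"Cidr": ""}])
    ml.create_input(**rtmp_input_params("obs-input", sg["SecurityGroup"]["Id"]))
    print(endpoint["Url"])  # the playback address to hand to viewers


if __name__ == "__main__":
    main()
```

The printed endpoint URL is what you would give to players, and the input's RTMP destination is what you paste into OBS in step 5.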

AWS Lambda: how to share code in a monorepo

A while back, a client asked me "how can I share business logic between services in a Node.js monorepo?". The TL;DR of it is:

- Encapsulate the shared business logic into modules, and put them in a separate folder.
- In the Lambda handler functions, reference the shared modules using relative paths.
- Use webpack to resolve and bundle them into the deployment package. If you use the Serverless framework, check out the serverless-webpack plugin.
- Deploy every service on every commit. You can do this with a simple script.

To see how everything fits together, check out this demo repo. It already has CI/CD set up, and you can see a recent deployment of all the services in the monorepo here.

But wait! How do I choose between having a monorepo vs a repo per service? Don't worry, I've got you covered for that too ;-) You can read about my thoughts on the two approaches here.

Hi, my name is Yan Cui. I'm an AWS Serverless Hero and the author of Production-Ready Serverless. I have run production workloads at scale in AWS for nearly 10 years and have been an architect or principal engineer in a variety of industries, ranging from banking, e-commerce, and sports streaming to mobile gaming. I currently work as an independent consultant focused on AWS and serverless. You can contact me via Email, Twitter, and LinkedIn.

Check out my new course, Complete Guide to AWS Step Functions. In this course, we'll cover everything you need to know to use the AWS Step Functions service effectively.
Including basic concepts, HTTP and event triggers, activities, design patterns, and best practices. Get your copy here.

Come learn about operational best practices for AWS Lambda: CI/CD, testing and debugging functions locally, logging, monitoring, distributed tracing, canary deployments, config management, authentication and authorization, VPC, security, error handling, and more. You can also get 40% off the face price with the code ytcui. Get your copy here.

Originally published on June 29, 2019.

AWS Lambda: how to share code in a monorepo was originally published in Hacker Noon on Medium, where people are continuing the conversation by highlighting and responding to this story.
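The "deploy every service on every commit" step can be sketched as a small script. This is an illustrative sketch, not the author's original: it assumes a hypothetical monorepo layout where each service lives under services/<name> with its own serverless.yml, and it shells out to the Serverless framework CLI for each one.

```python
import subprocess
from pathlib import Path


def find_services(root):
    # A "service" is any directory under services/ containing a serverless.yml.
    base = Path(root) / "services"
    return sorted(p.parent.name for p in base.glob("*/serverless.yml"))


def deploy_all(root, stage="dev"):
    # Deploy each service in turn; serverless-webpack (configured per service)
    # bundles the shared modules referenced via relative paths.
    for name in find_services(root):
        subprocess.run(
            ["npx", "serverless", "deploy", "--stage", stage],
            cwd=Path(root) / "services" / name,
            check=True,  # stop the pipeline if any deployment fails
        )


if __name__ == "__main__":
    deploy_all(".")
```

Run from the repo root on every commit (e.g., as a CI step), this deploys every service, which keeps all consumers of the shared modules up to date.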

Blockchain Adoption in the Global Agri-Food Industry, 2019 - Analysis on Top Tech Companies IBM, SAP, Microsoft, Accenture, and AWS

The "Blockchain - Adoption in Agri-Food Industry" report has been added to's offering. After proving its mettle in the cryptocurrency world, Blockchain (BC), a distributed open ledger technology, has started disrupting other industries such as healthcare, automotive, BFSI, transportation, etc. In the agri-food industry, where the supply chain is a vital part of the industry, BC-based solutions facilitate trust and acceptability. The outbreak of food-borne illnesses, food frauds and mishandling of produce, has only added to the need for developing track and trace solutions, available with a BC-enabled solution. Over the years, BC technology has grown and has had a positive influence on the agri-food industry. This trend can only be expected to strengthen over time. BC solutions for the agri-food industry range from managing food wastes to cost reduction. It has also addressed solutions for crop insurance, fair-trade practices, and established direct contact between consumers and producers. This report includes a comprehensive analysis of the adoption of BC technology in the agri-food industry and highlights the major trends and opportunities across the ecosystem. Blockchain Impact & Adoption Trend Analysis This section of the report identifies the segments of the agri-food industry from the perspective of BC technology. This includes a summary of the structure and ecosystem, foundational elements, types of BC and their uses, and a detailed analysis of the impact of this technology on the agri-food industry as a whole. The section also provides a detailed analysis of the various use ...Full story available on

Building a Serverless Data Lake with AWS Glue, DynamoDB and Athena

In this post, we will build a serverless data lake solution using AWS Glue, DynamoDB, S3, and Athena. Combining everything, these are the steps:

1. Create DynamoDB tables and insert data.
2. Create a crawler that reads the DynamoDB tables.
3. Create Glue ETL jobs that read the data and store it in an S3 bucket.
4. Create Data Catalog tables that read the S3 bucket.
5. Use Athena to query the information using the crawler created in the previous step.

Let's get started

First we need to create the DynamoDB tables and insert data. In our case we are going to use the AWS CLI. Let's create three sample tables: Forum, Thread, and Reply.

$ aws dynamodb create-table --table-name Forum --attribute-definitions AttributeName=Name,AttributeType=S --key-schema AttributeName=Name,KeyType=HASH --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=1 --region ap-southeast-2
$ aws dynamodb create-table --table-name Thread --attribute-definitions AttributeName=ForumName,AttributeType=S AttributeName=Subject,AttributeType=S --key-schema AttributeName=ForumName,KeyType=HASH AttributeName=Subject,KeyType=RANGE --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=1 --region ap-southeast-2
$ aws dynamodb create-table --table-name Reply --attribute-definitions AttributeName=Id,AttributeType=S AttributeName=ReplyDateTime,AttributeType=S --key-schema AttributeName=Id,KeyType=HASH AttributeName=ReplyDateTime,KeyType=RANGE --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=1 --region ap-southeast-2

Now, insert sample data into the created tables using BatchWriteItem:

$ aws dynamodb batch-write-item --request-items file://forum.json --region ap-southeast-2
$ aws dynamodb batch-write-item --request-items file://thread.json --region ap-southeast-2
$ aws dynamodb batch-write-item --request-items file://reply.json --region ap-southeast-2

Glue Crawler

Now that we have tables and data, let's create a crawler that reads the DynamoDB tables. Open the AWS Glue console and create a new database, demo. Then add a new Glue Crawler to add the Parquet and enriched data in S3 to the AWS Glue Data Catalog, making it available to Athena for queries. In the Data stores step, select DynamoDB as the data source and select the Forum table. A crawler can crawl multiple data stores in a single run; in our case we need to add all three tables as data stores. Next, select Create an IAM role and name the IAM role in the IAM Role step. Then set a schedule for the crawler; in our case I set it to Run on demand. Select demo as the database and dynamodb as the table prefix in the next step. After the crawler is added, we should see it in the AWS Glue console; let's run it. Upon completion, the crawler creates or updates one or more tables in our Data Catalog.

AWS Glue ETL Job

AWS Glue provides a managed Apache Spark environment to run your ETL job without maintaining any infrastructure, with a pay-as-you-go model. An AWS Glue ETL job extracts data from our source and writes the results into an S3 bucket. Let's create the S3 bucket using the CLI:

$ aws s3api create-bucket --bucket aws-glue-forum.reply.thread.demos --create-bucket-configuration LocationConstraint=ap-southeast-2 --region ap-southeast-2

Open the AWS Glue console and choose Jobs under the ETL section to start creating an AWS Glue ETL job. Select the dynamodbforum data source in the Data source step. On the next step, choose your raw Amazon S3 bucket as the data source, and choose Next.
On the Data target page, choose the processed Amazon S3 bucket as the data target path, and choose Parquet as the Format. Lastly, review your job parameters and choose Save Job and Edit Script. On the next page, we need to modify the script to prevent duplicate data being generated by each job execution; add the following code to the job script:

import boto3

s3 = boto3.resource('s3')
bucket = s3.Bucket('aws-glue-forum.reply.thread.demos')
bucket.objects.filter(Prefix="forum/").delete()

Now we have our forum ETL job created; repeat the steps above to create the other two jobs. Once all three jobs are created, we can automate their execution with a job trigger; in our case we will run all of the jobs manually.

Okay, all done. This is where the fun begins! Let's create table entries in AWS Glue for the resulting table data in Amazon S3, so you can analyze that data with Athena using standard SQL.

AWS Athena

Open the AWS Athena console and choose the demo database. Paste the following queries into the query editor to create the three tables:

CREATE EXTERNAL TABLE IF NOT EXISTS (
  `threads` bigint,
  `category` string,
  `messages` bigint,
  `views` bigint,
  `name` string)
ROW FORMAT SERDE ''
WITH SERDEPROPERTIES ('serialization.format' = '1')
LOCATION 's3://aws-glue-forum.reply.thread.demos/forum/'
TBLPROPERTIES ('has_encrypted_data'='false');

CREATE EXTERNAL TABLE IF NOT EXISTS demo.reply (
  `replydatetime` string,
  `message` string,
  `postedby` string,
  `id` string)
ROW FORMAT SERDE ''
WITH SERDEPROPERTIES ('serialization.format' = '1')
LOCATION 's3://aws-glue-forum.reply.thread.demos/reply/'
TBLPROPERTIES ('has_encrypted_data'='false');

CREATE EXTERNAL TABLE IF NOT EXISTS demo.thread (
  `views` bigint,
  `message` string,
  `lastposteddatetime` bigint,
  `forumname` string,
  `lastpostedby` string,
  `replies` bigint,
  `answered` bigint,
  `tags` array<string>,
  `subject` string)
ROW FORMAT SERDE ''
WITH SERDEPROPERTIES ('serialization.format' = '1')
LOCATION 's3://aws-glue-forum.reply.thread.demos/thread/'
TBLPROPERTIES ('has_encrypted_data'='false');

Now, try the following analytical query against this data. Choose Run query to run it, and view the output under Results.

Example query:

select thread.subject as thread, forum.category, reply.message as reply
from thread
left join forum on thread.forumname =
left join reply on concat(thread.forumname, '#', thread.subject) like
order by thread;

Example results:

The Amazon Athena query engine is based on Presto. For more information about these functions, see Presto 0.172 Functions and Operators.

That's about it! Thanks for reading! I hope you found this article useful. You can find the complete project in my GitHub repo.

Building Serverless Data Lake with AWS Glue DynamoDB and Athena was originally published in Hacker Noon on Medium, where people are continuing the conversation by highlighting and responding to this story.
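The console steps above can also be driven from code. Below is a minimal boto3 sketch that starts the crawler, kicks off the ETL jobs, and submits an Athena query. The crawler name "demo-crawler", the three job names, and the athena-results prefix are made-up placeholders, not names from the original walkthrough.

```python
def athena_query_params(sql, database, output_s3):
    # Builds the parameters for athena.start_query_execution.
    return {
        "QueryString": sql,
        "QueryExecutionContext": {"Database": database},
        "ResultConfiguration": {"OutputLocation": output_s3},
    }


def main():
    # boto3 is imported here so the helper above can be used without the SDK;
    # the calls below require AWS credentials for the target account.
    import boto3

    glue = boto3.client("glue", region_name="ap-southeast-2")
    athena = boto3.client("athena", region_name="ap-southeast-2")

    # Run the crawler, then the three ETL jobs (names are placeholders).
    glue.start_crawler(Name="demo-crawler")
    for job in ["forum-etl", "thread-etl", "reply-etl"]:
        glue.start_job_run(JobName=job)

    # Query one of the resulting tables with Athena.
    resp = athena.start_query_execution(**athena_query_params(
        "SELECT count(*) FROM demo.thread",
        "demo",
        "s3://aws-glue-forum.reply.thread.demos/athena-results/",
    ))
    print(resp["QueryExecutionId"])
```

In a real pipeline you would wait for the crawler and job runs to finish (e.g., by polling get_crawler and get_job_run) before querying, rather than firing everything at once as this sketch does.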
More news sources

AWS news by Finrazor


Hot news

Hot world news

Bitcoin Struggles As BAT And ETC Lead The Charge

The cryptocurrency market has somewhat stabilized, presenting a predominantly green landscape as Bitcoin struggles to stay above the psychological $10,000 marker.

Cryptocurrency Market Situation. Source: Coin360

Sentiment for Bitcoin has seen a moderate improvement towards a strongly neutral outlook.

Bitcoin Sentiment Chart by

Despite this overall lukewarm performance, proponents of Bitcoin's store-of-value potential have reason to rejoice today. According to a report by Digital Asset Data, BTC is increasingly gaining correlation with the broader asset markets: positively with gold, and negatively with the stock market. With a recession looming over the global economy, Bitcoin could fulfill the role of a "safe haven asset" to which investors flock during uncertain economic times.

Altcoins see recovery with ETC and BAT leading

The rest of the cryptocurrency market is seeing strong corrections from yesterday's fall, with two strong outliers making significant gains. Basic Attention Token is reacting strongly to its new listing on Kraken, a popular exchange that is part of Weiss Ratings' Real 10 index of platforms reporting true volume. BAT has gained more than 15% on yesterday's price, while curiously its partner-in-listing WAVES has registered a much more modest 4%. The stark difference can be explained by the contribution of other fundamental drivers, with BAT recently launching the much-anticipated online tipping feature in its browser.

BAT Recent Price Trend. Source: CoinMarketCap

The other outlier is none other than Ethereum Classic, which after an against-the-grain rally on Tuesday has continued today with a 14.6% gain. The total 7-day performance is a solid +21%, by far the highest in the top 50. As before, it's difficult to give a meaningful explanation for the rally. The upcoming ETC Atlantis hard fork is scheduled in about 22 days, too far ahead to justify any price action, though the final confirmation was released on Monday.
The run is likely due to a combination of factors, including possible whales entering ETC positions. The rest of the altcoin market is seeing moderate recoveries, with IOTA, TRON, and Cardano gaining 8%, 6.32%, and 6.42% respectively, as the rest average 3-4%.

The post Bitcoin Struggles As BAT And ETC Lead The Charge appeared first on Crypto Briefing.

Casa Unveiled Node Monitor Service to Leapfrog Bitcoin Network Health

Coinspeaker Casa Unveiled Node Monitor Service to Leapfrog Bitcoin Network Health

Casa, a well-known crypto startup that offers key management services and a Bitcoin node machine, has launched a node monitor and an accompanying reward program to improve Bitcoin network health. The firm revealed the innovation in an official website article on Aug. 21. Per the announcement, the node monitor, known as Node Heartbeats, depends on creating a brief connection between Casa's server and an internet-synced, Tor-enabled node owned by a user. The rewards program enables Casa node subscribers to earn 10,000 SatsBack weekly in exchange for running 5 Node Heartbeat checks per week, on separate days. SatsBack can reportedly be converted to Bitcoin (BTC) just once per day in the Keymaster app for Casa, as long as a user has earned a minimum of 50,000 SatsBack points. According to the report, Casa is firmly convinced that it is hard for users to keep up with their node's uptime and security. By offering an incentive program to improve node health, the firm reportedly expects to enhance the overall health of the Bitcoin network.

How the Node Heartbeat operates

To verify that a node is online, the company makes a brief connection from its server to the user's Casa Node. For this to work, the user's node must be synced, online, and have Tor enabled. Because it goes over Tor, the Node Heartbeat protects the user's privacy. Speaking of privacy, the Node Heartbeat only uses the user's connection code: the same code that already appears in explorers and that the user shares with others who want to open a channel with them. The Sats App automatically submits the code when users send a Heartbeat, saving them the trouble of looking it up.

Guess who is the latest Casa investor?

Charlie Lee, Litecoin (LTC) founder, revealed just three days ago that he has invested heavily in Casa.
He went on to praise Casa for spearheading BTC adoption, saying: "I have the same feeling about Casa today as I had about Coinbase when I joined in 2013 as the 3rd hire. Casa is making Bitcoin easy to use, and that is extremely important for this space. Looking forward to great things!"

New node monitor by Lightning Labs

As initially reported by Cointelegraph, Lightning Labs, the developer of the high-speed transaction protocol Lightning Network, recently unveiled an alpha version of a node monitor. The tool, called Indmon, reportedly enables node operators to supervise node functions in real time. Network incidents this year reportedly prompted the developers to design a tool for preemptively detecting network and node issues.

Gemini Exchange Launches In Australia in Effort to Expand its ‘Crypto Needs Rules’ Brand

One of the most prominent exchanges, Gemini, which is owned by the Winklevoss twins, is now available to Australian crypto users. “Cryptocurrency is the future of money, and we're committed to building a bridge to that future in Australia.” – @tylerwinklevoss We are thrilled to announce that starting today, we are operational in Australia 🇦🇺 Read […]
Bitcoin Exchange Guide

Tether to Launch RMB Stablecoin "CNHT"? CryptoNews

Tether is rumored to be launching an RMB stablecoin. Such a move could land crypto in serious trouble in China, as the RMB is tightly controlled and regulated. Binance Venus is also on its tail. Ukrainian miners accidentally leaked sensitive nuclear power plant data. #Cryptocurrency #crypto #Tether

I'm not a professional financial adviser and you should always do your own research. I may hold the cryptocurrencies talked about in the video.

Kraken To List Basic Attention Token (BAT) and Waves

Kraken has announced it will list BAT and WAVES, with pairs denominated in USD, EUR, BTC, and ETH. Trading commences at 13:30 UTC on August 22, 2019, and deposits will take 30 and 10 confirmations before being credited to users' Kraken balances for BAT and WAVES, respectively. August 21, 2019. Kraken Expands Altcoin Offerings. The post by Ashwath Balakrishnan appeared first on BTCManager, Bitcoin, Blockchain & Cryptocurrency News
BTC Manager