
Thursday, May 7, 2020

5G Network

5G is the 5th generation mobile network. It is a new global wireless standard after 1G, 2G, 3G, and 4G networks. 5G enables a new kind of network that is designed to connect virtually everyone and everything together including machines, objects, and devices.
5G wireless technology is meant to deliver higher multi-Gbps peak data speeds, ultra-low latency, greater reliability, massive network capacity, increased availability, and a more uniform user experience to more users. Higher performance and improved efficiency empower new user experiences and connect new industries.
1G, 2G, 3G, and 4G all led to 5G, which is designed to provide more connectivity than was ever available before.
5G is a unified, more capable air interface. It has been designed with an extended capacity to enable next-generation user experiences, empower new deployment models and deliver new services.
With high speeds, superior reliability and negligible latency, 5G will expand the mobile ecosystem into new realms. 5G will impact every industry, making safer transportation, remote healthcare, precision agriculture, digitized logistics — and more — a reality.

5G is based on OFDM (orthogonal frequency-division multiplexing), a method of modulating a digital signal across several different channels to reduce interference. 5G uses the 5G NR air interface alongside OFDM principles. 5G also uses wider-bandwidth technologies such as sub-6 GHz and mmWave.
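The core OFDM idea, splitting data across many narrow orthogonal subcarriers that are modulated with an inverse FFT and demodulated with a forward FFT, can be sketched in a few lines. This is a toy illustration of the principle only, not the actual 5G NR waveform (which adds cyclic prefixes, numerology, coding, and much more):

```python
import numpy as np

# Each OFDM symbol carries data on many narrow, orthogonal subcarriers.
n_subcarriers = 64

# Hypothetical payload: one QPSK symbol per subcarrier.
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=(n_subcarriers, 2))
qpsk = (2 * bits[:, 0] - 1 + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# Transmitter: inverse FFT maps frequency-domain symbols to a
# time-domain waveform.
time_signal = np.fft.ifft(qpsk)

# Receiver: forward FFT recovers each subcarrier independently;
# orthogonality means the subcarriers do not interfere with each other.
recovered = np.fft.fft(time_signal)

assert np.allclose(recovered, qpsk)
```

Because each subcarrier is narrow, a deep fade or interferer corrupts only a few of them, which is the interference-reduction property the paragraph above refers to.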

5G OFDM operates on the same mobile networking principles as 4G LTE. However, the new 5G NR air interface can further enhance OFDM to deliver a much higher degree of flexibility and scalability. This could provide 5G access to more people and things for a variety of different use cases.

5G will bring wider bandwidths by expanding the usage of spectrum resources, from sub-3 GHz used in 4G to 100 GHz and beyond. 5G can operate in both lower bands (e.g., sub-6 GHz) as well as mmWave (e.g., 24 GHz and up), which will bring extreme capacity, multi-Gbps throughput, and low latency.

5G is designed to not only deliver faster, better mobile broadband services compared to 4G LTE, but can also expand into new service areas such as mission-critical communications and connecting the massive IoT. This is enabled by many new 5G NR air interface design techniques, such as a new self-contained TDD subframe design.
First generation - 1G
1980s: 1G delivered analog voice.
Second generation - 2G
Early 1990s: 2G introduced digital voice (e.g., CDMA, Code Division Multiple Access).
Third generation - 3G
Early 2000s: 3G brought mobile data (e.g. CDMA2000).
Fourth generation - 4G LTE
2010s: 4G LTE ushered in the era of mobile broadband.


There are several reasons that 5G will be better than 4G:

• 5G is significantly faster than 4G
• 5G has more capacity than 4G
• 5G has significantly lower latency than 4G
• 5G is a unified platform that is more capable than 4G
• 5G uses spectrum better than 4G

5G is a unified platform that is more capable than 4G.
While 4G LTE focused on delivering much faster mobile broadband services than 3G, 5G is designed to be a unified, more capable platform that not only elevates mobile broadband experiences, but also supports new services such as mission-critical communications and the massive IoT. 5G can also natively support all spectrum types (licensed, shared, unlicensed) and bands (low, mid, high), a wide range of deployment models (from traditional macro-cells to hotspots), and new ways to interconnect (such as device-to-device and multi-hop mesh).

5G uses spectrum better than 4G.
5G is also designed to get the most out of every bit of spectrum across a wide array of available spectrum regulatory paradigms and bands—from low bands below 1 GHz, to mid bands from 1 GHz to 6 GHz, to high bands known as millimeter wave (mmWave).

5G is faster than 4G.
5G can be significantly faster than 4G, delivering up to 20 Gigabits-per-second (Gbps) peak data rates and 100+ Megabits-per-second (Mbps) average data rates.
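The quoted rates are easy to put in perspective with some back-of-the-envelope arithmetic. The sketch below uses the peak and average figures from the paragraph above; real-world throughput varies widely with spectrum, load, and distance from the cell:

```python
# Download time for a 10 GB file at the rates quoted above.
file_bits = 10 * 8 * 10**9          # 10 gigabytes expressed in bits

peak_5g = 20 * 10**9                # 20 Gbps peak data rate
avg_5g = 100 * 10**6                # 100 Mbps average data rate

print(f"peak 5G:    {file_bits / peak_5g:.1f} s")    # 4.0 s
print(f"average 5G: {file_bits / avg_5g:.0f} s")     # 800 s
```

Even the average rate turns a multi-hour 3G-era download into minutes, which is the kind of gap the "significantly faster" claim rests on.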

5G has more capacity than 4G.
5G is designed to support a 100x increase in traffic capacity and network efficiency.

5G has lower latency than 4G.
5G has significantly lower latency to deliver more instantaneous, real-time access: a 10x decrease in end-to-end latency down to 1ms.
5G is driving global growth.
• $13.2 trillion of global economic output
• 22.3 million new jobs created
• $2.1 trillion in GDP growth
A landmark 5G Economy study found that 5G's full economic effect will likely be realized across the globe by 2035, supporting a wide range of industries and potentially enabling up to $13.2 trillion worth of goods and services.
This impact is much greater than previous network generations. The development requirements of the new 5G network are also expanding beyond the traditional mobile networking players to industries such as the automotive industry.
The study also revealed that the 5G value chain (including OEMs, operators, content creators, app developers, and consumers) could alone support up to 22.3 million jobs, or more than one job for every person in Beijing, China. And there are many emerging and new applications that will still be defined in the future. Only time will tell what the full “5G effect” on the economy is going to be.

The advent of 5G is one of the most anticipated upcoming technologies that could impact businesses in 2020. Many industry experts have termed 5G the future of communication, and to some extent, that's true.
“5G wireless networks will support 1,000-fold gains in capacity, connections for at least 100 billion devices and a 10 Gb/s individual user experience of extremely low latency and response times,” Huawei has stated. “Deployment of these networks will emerge between 2020 and 2030.”
However, to bring this speed to everyday mobile users, mobile network carriers will need to increase bandwidth and reduce network costs. Moreover, LTE adoption isn't waning; the LTE market is estimated to reach $672 billion by the end of 2020.



Tuesday, May 5, 2020

Internet of Things (IoT)


The Internet of Things (IoT) is a system of interrelated computing devices, mechanical and digital machines provided with unique identifiers (UIDs) and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction.
The definition of the Internet of Things has evolved due to the convergence of multiple technologies, real-time analytics, machine learning, commodity sensors, and embedded systems. Traditional fields of embedded systems, wireless sensor networks, control systems, automation (including home and building automation), and others all contribute to enabling the Internet of Things. In the consumer market, IoT technology is most synonymous with products pertaining to the concept of the "smart home", covering devices and appliances (such as lighting fixtures, thermostats, home security systems and cameras, and other home appliances) that support one or more common ecosystems and can be controlled via devices associated with that ecosystem, such as smartphones and smart speakers.
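The definition above hinges on two mechanics: every device carries a unique identifier, and it reports data over a network without human involvement. A minimal sketch of such a telemetry message follows; the field names and payload format are illustrative assumptions, not any IoT standard:

```python
import json
import time
import uuid

# Each device carries a unique identifier (UID), generated once here.
DEVICE_ID = str(uuid.uuid4())

def make_reading(temperature_c: float) -> str:
    """Serialize one sensor reading as JSON, ready to publish over a
    network (e.g., to an MQTT broker or HTTP endpoint)."""
    return json.dumps({
        "device_id": DEVICE_ID,        # the UID ties data to its device
        "timestamp": time.time(),      # when the reading was taken
        "temperature_c": temperature_c,
    })

msg = make_reading(21.5)
print(msg)
```

In a real deployment the message would be published automatically on a schedule or on a sensor event, which is exactly the "without requiring human interaction" part of the definition.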
There are a number of serious concerns about dangers in the growth of IoT, especially in the areas of privacy and security, and consequently industry and governmental moves to address these concerns have begun.
Many “things” are now being built with Wi-Fi connectivity, meaning they can be connected to the Internet and to each other. Hence, the Internet of Things, or IoT. The Internet of Things is the future and has already enabled devices, home appliances, cars, and much more to connect to and exchange data over the Internet. And we're only in the beginning stages of IoT: the number of IoT devices reached 8.4 billion in 2017 and is expected to reach 30 billion by 2020.
As consumers, we're already using and benefitting from IoT. We can lock our doors remotely if we forget to when we leave for work, and preheat our ovens on our way home, all while tracking our fitness on our Fitbits and hailing a ride with Lyft. But businesses also have much to gain now and in the near future. IoT can enable better safety, efficiency, and decision-making for businesses as data is collected and analyzed. It can enable predictive maintenance, speed up medical care, improve customer service, and offer benefits we haven't even imagined yet.
However, despite this boom in the development and adoption of IoT, experts say not enough IT professionals are getting trained for IoT jobs. An article at IT Pro Today says we'll need 200,000 more IT workers who aren't yet in the pipeline, and that a survey of engineers found 25.7 percent believe inadequate skill levels to be the industry's biggest obstacle to growth. For someone interested in a career in IoT, that means easy entry into the field if you're motivated, with a range of options for getting started. Skills needed include IoT security, cloud computing knowledge, data analytics, automation, understanding of embedded systems, and device knowledge, to name only a few. After all, it's the Internet of Things, and those things are many and varied, meaning the skills needed are as well.


Blockchain


Blockchain technology is most simply defined as a decentralized, distributed ledger that records the provenance of a digital asset. 
Blockchain, sometimes referred to as Distributed Ledger Technology (DLT), makes the history of any digital asset unalterable and transparent through the use of decentralization and cryptographic hashing.  
A simple analogy for understanding blockchain technology is a Google Doc. When we create a document and share it with a group of people, the document is distributed instead of copied or transferred. This creates a decentralized distribution chain that gives everyone access to the document at the same time. No one is locked out awaiting changes from another party, while all modifications to the doc are being recorded in real-time, making changes completely transparent.
Of course, blockchain is more complicated than a Google Doc, but the analogy is apt because it illustrates three critical ideas of the technology: the asset is distributed rather than copied or transferred, access is decentralized so every party sees the same record, and all changes are recorded transparently in real time.
Blockchain is an especially promising and revolutionary technology because it helps reduce risk, stamps out fraud and brings transparency in a scaleable way for myriad uses.
Although most people think of blockchain technology in relation to cryptocurrencies such as Bitcoin, blockchain offers security that is useful in many other ways. In the simplest of terms, blockchain can be described as data you can only add to, not take away from or change. Hence the term “chain” because you’re making a chain of data. Not being able to change the previous blocks is what makes it so secure. In addition, blockchains are consensus-driven, so no one entity can take control of the data. With blockchain, you don’t need a trusted third-party to oversee or validate transactions. You can refer to our Blockchain tutorial for a detailed and thorough understanding of the technology.
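The append-only, hash-chained structure described above can be sketched in a few lines of code. This is a toy illustration of why tampering is detectable, using SHA-256 hashing; it is not a production blockchain (no consensus, networking, or proof-of-work):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain: list, data: str) -> None:
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

chain = []
add_block(chain, "genesis")
add_block(chain, "Alice pays Bob 5")

# Tampering with an earlier block breaks the chain: the prev_hash
# stored in the next block no longer matches the recomputed hash.
chain[0]["data"] = "genesis (tampered)"
assert chain[1]["prev_hash"] != block_hash(chain[0])
```

Because every block embeds the hash of the one before it, changing any past block invalidates every later block, which is the "can only add to, not take away from or change" property described above.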
Several industries are adopting and implementing blockchain, and as the use of blockchain technology increases, so too does the demand for skilled professionals. In that regard, we are already behind. According to Techcrunch.com, blockchain-related jobs are the second-fastest-growing category of jobs, with 14 job openings for every one blockchain developer. A blockchain developer specializes in developing and implementing architecture and solutions using blockchain technology. The average yearly salary of a blockchain developer is $130,000. If you are intrigued by blockchain and its applications and want to make your career in this fast-growing industry, then this is the right time to learn blockchain and gear up for an exciting future.

Sunday, May 3, 2020

Cybersecurity


Cyber security refers to the body of technologies, processes, and practices designed to protect networks, devices, programs, and data from attack, damage, or unauthorized access.
Types of cyber security are the techniques used to protect data from being stolen or attacked.
Types of Cyber Attacks
1.    Denial of Service (DoS)
2.    Hacking
3.    Malware
4.    Phishing
5.    Spoofing
6.    Ransomware
7.    Spamming
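Defenses evolve alongside these attacks. As one small sketch, a fixed-window rate limiter is a basic mitigation for denial-of-service floods: clients that exceed a request budget get rejected. The window size and budget below are illustrative thresholds, not recommended production values:

```python
import time
from collections import defaultdict

WINDOW_S = 1.0      # length of each counting window (assumed)
MAX_REQUESTS = 5    # hypothetical per-client budget per window

# client id -> [window start time, requests seen in this window]
windows = defaultdict(lambda: [0.0, 0])

def allow(client: str, now: float) -> bool:
    """Return True if this client's request fits within its budget."""
    start, count = windows[client]
    if now - start >= WINDOW_S:
        windows[client] = [now, 1]   # new window begins
        return True
    if count < MAX_REQUESTS:
        windows[client][1] += 1
        return True
    return False                     # budget exhausted: reject

t = time.time()
results = [allow("10.0.0.1", t) for _ in range(7)]
print(results)  # first 5 allowed, the rest rejected
```

Real DoS protection layers many such mechanisms (upstream filtering, anycast, challenge pages), but the capped-budget idea is the common core.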
Cybersecurity might not seem like emerging technology, given that it has been around for a while, but it is evolving just as other technologies are. That's in part because new threats constantly emerge. The malevolent hackers who are trying to illegally access data are not going to give up any time soon, and they will continue to find ways to get through even the toughest security measures. It's also in part because new technology is being adapted to enhance security. As long as we have hackers, we will have cyber security as an emerging technology, because it will constantly evolve to defend against those hackers.
As proof of the strong need for cybersecurity professionals, the number of cybersecurity jobs is growing three times faster than other tech jobs. However, we’re falling short when it comes to filling those jobs. As a result, it’s predicted that we will have 3.5 million unfilled cybersecurity jobs by 2021.
Many cyber security jobs pay six-figure incomes, and roles can range from the ethical hacker to security engineer to Chief Security Officer, offering a promising career path for someone who wants to get into and stick with this domain.


Saturday, May 2, 2020

Cloud computing


Cloud computing is the on-demand availability of computer system resources, especially data storage and computing power, without direct active management by the user. The term is generally used to describe data centers available to many users over the Internet. Large clouds, predominant today, often have functions distributed over multiple locations from central servers. If the connection to the user is relatively close, it may be designated an edge server.
Clouds may be limited to a single organization (enterprise clouds), or be available to many organizations (public cloud).
Cloud computing relies on sharing of resources to achieve coherence and economies of scale.
Advocates of public and hybrid clouds note that cloud computing allows companies to avoid or minimize up-front IT infrastructure costs. Proponents also claim that cloud computing allows enterprises to get their applications up and running faster, with improved manageability and less maintenance, and that it enables IT teams to more rapidly adjust resources to meet fluctuating and unpredictable demand, providing burst computing capability: high computing power at certain periods of peak demand.
Cloud providers typically use a "pay-as-you-go" model, which can lead to unexpected operating expenses if administrators are not familiarized with cloud-pricing models.
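The "pay-as-you-go" model means cost tracks usage rather than fixed up-front capacity, which is also why a burst month can surprise administrators. A minimal cost sketch follows; the rates are invented for illustration and are not any provider's actual pricing:

```python
# Hypothetical usage-based rates (assumptions, not real pricing).
COMPUTE_PER_HOUR = 0.05       # $ per VM-hour
STORAGE_PER_GB_MONTH = 0.02   # $ per GB-month

def monthly_cost(vm_hours: float, storage_gb: float) -> float:
    """Estimate one month's bill under pay-as-you-go pricing."""
    return vm_hours * COMPUTE_PER_HOUR + storage_gb * STORAGE_PER_GB_MONTH

# Steady baseline vs a burst month: the bill scales with demand.
print(f"${monthly_cost(720, 100):.2f}")    # one VM all month
print(f"${monthly_cost(2160, 100):.2f}")   # 3x compute burst
```

The same elasticity that enables burst capacity is what produces unexpected bills when usage isn't monitored, the risk the paragraph above points out.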
The availability of high-capacity networks, low-cost computers and storage devices, as well as the widespread adoption of hardware virtualization, service-oriented architecture, and autonomic and utility computing, has led to growth in cloud computing. By 2019, Linux was the most widely used operating system in the cloud, including in Microsoft's offerings, and is thus described as dominant. The cloud service provider (CSP) monitors, maintains, and gathers data about the firewalls, intrusion detection and/or prevention systems, and data flows within the network.


Edge computing


Edge computing is a distributed computing paradigm which brings computation and data storage closer to the location where it is needed, to improve response times and save bandwidth. The origins of edge computing lie in content delivery networks that were created in the late 1990s to serve web and video content from edge servers deployed close to users. In the early 2000s, these networks evolved to host applications and application components at the edge servers, resulting in the first commercial edge computing services, which hosted applications such as dealer locators, shopping carts, real-time data aggregators, and ad insertion engines. Modern edge computing significantly extends this approach through virtualization technology that makes it easier to deploy and run a wider range of applications on the edge servers.
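The response-time benefit is easy to quantify with a simple model: requests served at a nearby edge node avoid the long round trip to a distant origin. The latency figures below are illustrative assumptions, not measurements:

```python
# Assumed round-trip times (illustrative, not measured).
RTT_EDGE_MS = 10      # user <-> nearby edge server
RTT_ORIGIN_MS = 120   # edge <-> distant origin data center

def avg_latency_ms(edge_hit_rate: float) -> float:
    """Mean response latency when a fraction of requests is served at
    the edge and the rest must continue on to the origin."""
    miss = 1.0 - edge_hit_rate
    return edge_hit_rate * RTT_EDGE_MS + miss * (RTT_EDGE_MS + RTT_ORIGIN_MS)

print(f"{avg_latency_ms(0.0):.1f} ms")   # everything from origin: 130.0 ms
print(f"{avg_latency_ms(0.9):.1f} ms")   # 90% served at the edge: 22.0 ms
```

Every request answered at the edge also never crosses the backhaul link, which is the bandwidth-saving half of the claim.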

Friday, May 1, 2020

Virtual Reality and Augmented Reality


Virtual Reality (VR) immerses the user in an environment, while Augmented Reality (AR) enhances their environment. Although VR has primarily been used for gaming thus far, it has also been used for training, as with VirtualShip, a simulation software used to train U.S. Navy, Army, and Coast Guard ship captains. The popular Pokémon Go is an example of AR.
Virtual reality (VR) is a simulated experience that can be similar to or completely different from the real world. Applications of virtual reality include entertainment (e.g., video games) and education (e.g., medical or military training).
Augmented reality (AR) is an interactive experience of a real-world environment where the objects that reside in the real world are enhanced by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory, and olfactory.
Both VR and AR have enormous potential in training, entertainment, education, marketing, and even rehabilitation after an injury. Either could be used to train doctors to do surgery, offer museum-goers a deeper experience, enhance theme parks, or even enhance marketing, as with this Pepsi Max bus shelter.
There are major players in the VR market, like Google, Samsung, and Oculus, but plenty of startups are forming, and they will be hiring; the demand for professionals with VR and AR skills will only increase. Getting started in VR doesn't require a lot of specialized knowledge: basic programming skills and a forward-thinking mindset can land a job, although some employers will also look for optics expertise and hardware engineers.