

Posts Tagged ‘mdm’

Windows Mixed Reality headsets

Wednesday, October 18th, 2017

Those Windows Mixed Reality headsets Microsoft teased nearly a year ago are now shipping with the rollout of the Windows 10 Fall Creators Update. Made by Microsoft’s PC maker partners, the new headsets were first shown at IFA Berlin, one of the world’s biggest consumer technology trade shows. Prices start at $299 for a headset alone, but expect to pay an additional $100 to get one bundled with motion controllers. Although that initial pricing undercut competitors, once a Windows Mixed Reality headset is bundled with controllers it costs the same as or more than the current $399 Oculus Rift bundle. They are still cheaper than the HTC Vive, and they don’t require a super-powerful PC to run them.

Also, unlike Oculus and Vive, WMR headsets use a pair of front-mounted cameras and a set of built-in sensors to map your physical position. Called inside-out tracking, the design allows for six-degrees-of-freedom movement tracking without the need to buy external sensors and set them up in a dedicated space. They’re made to be plug-and-play for the most part, too, so you can be up and running in minutes just about anywhere. However, since they’re all designed to meet Microsoft’s specific requirements, there aren’t huge differences between the headsets. The first five headsets announced have the same basic set of specs:

  • Two high-resolution 1,440×1,440-pixel LCDs with up to 90Hz native refresh rate
  • Front-hinged display for quickly lifting the viewer up and out of the way
  • Built-in 3.5mm jack for audio and microphone support
  • Single cable with HDMI 2.0 and USB 3.0 for video and data
  • 4-meter (13.1-ft.) cable

Samsung’s Odyssey headset offers a slight variation on the formula, using 1,440×1,600-pixel AMOLED displays and skipping the flip-up design. Otherwise, at least for this first batch, the differences come down to overall design, and even those don’t vary much. Lenovo’s entry is perhaps the most boardroom-ready in appearance. Its Explorer follows the same design and feature set as the others. One nice extra, though, is that Lenovo will offer a set of its own apps for the headset through its own entertainment hub. The Lenovo Explorer is available now for $399 with a set of motion controllers.


Wi-Fi security flaw puts wireless devices at risk of hijack

Monday, October 16th, 2017

Researchers have discovered a flaw in the security protocol that’s a fixture in almost every modern Wi-Fi device, including computers, phones and routers. The weakness in the WPA2 protocol, which is meant to protect both wireless networks and devices, was discovered by computer security academic Mathy Vanhoef and has been nicknamed “KRACK,” short for Key Reinstallation Attack. The bug could ultimately allow hackers to eavesdrop on network traffic, which is bad news for anyone sending sensitive or private information over a Wi-Fi connection. These days, that’s pretty much all of us, although it could hit businesses using wireless point-of-sale machines particularly hard. You use Wi-Fi every day (you may even be on it right this very moment), and that means the device you’re using is at serious risk of being hijacked.

In May and June, ransomware attacks locked up computers around the world, demanding payment from people and companies in return for renewed access to vital information and systems. More recently came the hack at Equifax, which compromised the personal details of 145 million Americans, and the latest shoe to drop in the matter of Yahoo’s massive hack, which hit a breathtaking 3 billion accounts. KRACK comes on top of a seemingly endless string of bad news about security vulnerabilities, whether still merely potential or actually exploited by hackers.

In the case of KRACK, hackers would have to be within physical range of a vulnerable device to take advantage of the flaw, but if they’re in the right spot, they could use it to decrypt network traffic, hijack connections and inject content into the traffic stream. To do so would involve effectively impersonating a user who had already been granted access to the network so as to exploit a weakness in the secure four-way handshake that acts as its gatekeeper.
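To see why reinstalling an already-used key is so damaging, consider the nonce reuse it causes. The toy Python sketch below is our illustration, not WPA2’s actual AES-CCMP cipher or the real exploit: it simply shows that when two packets are encrypted with the same key and the same nonce, an eavesdropper can XOR the two ciphertexts together and cancel the keystream out, without ever learning the key.

```python
import hashlib

def keystream(key: bytes, nonce: int, length: int) -> bytes:
    """Stand-in keystream generator for the demo; real WPA2 uses AES-CCMP."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(
            key + nonce.to_bytes(8, "big") + counter.to_bytes(4, "big")
        ).digest()
        counter += 1
    return out[:length]

def xor_encrypt(key: bytes, nonce: int, plaintext: bytes) -> bytes:
    ks = keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

key = b"session-key-the-attacker-never-sees"
packet1 = b"user=alice&card=4111111111111111"
packet2 = b"user=bob__&card=5500005555555559"

c1 = xor_encrypt(key, nonce=1, plaintext=packet1)  # first packet, nonce 1
c2 = xor_encrypt(key, nonce=1, plaintext=packet2)  # nonce reused after the key is "reinstalled"

# XORing the two ciphertexts cancels the identical keystream, leaving
# packet1 XOR packet2 -- enough to recover contents with known plaintext.
leaked = bytes(a ^ b for a, b in zip(c1, c2))
assert leaked == bytes(a ^ b for a, b in zip(packet1, packet2))
print("keystream cancelled; plaintext relationship leaked")
```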


The Cloud! What is it? – The Cloud Explained

Tuesday, October 10th, 2017

The new technical term spreading around the world like wildfire is the “cloud.” Join IT GURUS OF ATLANTA, a Microsoft Partner, on this journey to see how we arrived at the cloud, why it matters, and why companies are racing to get their data onto it. So, let’s start from the top and figure out what people think of when they hear the words “The Cloud.” On first hearing the term, the typical mind may go to clouds in the sky, while a more technical mind would think “virtual.” To be honest, both answers are not far from each other. In more technical terms, the cloud is the elimination of the physical equipment, such as servers, that is typically used on-premises, for example in the server rooms where many companies and government agencies house their pertinent data. All traffic goes from the user and their computer, over the internet or an intranet, and ends up being housed somewhere. The websites we all visit, such as Google, Yahoo and CNN, and our favorite channels, such as HBO and Showtime, all live on servers. These huge data storage systems can be further divided into other storage devices such as a SAN (Storage Area Network) or NAS (Network Attached Storage). All the data on the internet, and on a person’s computer, is stored somewhere, and these servers store the data from those devices.

Another question people have is: what happens to the data stored on my computer? The storage on your computer is very limited. Even the larger drives that can be installed in a laptop or desktop, such as 1 TB (terabyte) or 3 TB, are still not enough to house all the data a company or a user may have. This forces companies to seek out other means of storage. Years ago, companies turned to portable media such as disks and CDs. Then the industry realized that trusting users to handle data and its security is where the line of trust blurs and accidents happen; a company’s or an individual’s life’s work can be contained in the data those users are handling. That’s where servers came in as a big relief. Servers could be kept locally at the company’s designated location, where user data is uploaded and stored. The industry saw the benefit, and it wasn’t long before servers began to run with redundancy, meaning the servers had a backup, and the backup had a backup. This increases the availability of the data in case of natural or unnatural disasters and data breaches. The data can still be used even if a whole data center housing thousands of servers goes down; it is immediately available from counterpart data centers that have already backed up the latest uploads. The user on the other end experiences no more than a second, or even a millisecond, of lag while the request is served from another data center.
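As a rough illustration of that redundancy idea (a minimal sketch, not any vendor’s actual replication system), the following Python snippet copies every write to a backup store, so reads keep succeeding even when the primary site is down:

```python
class ReplicatedStore:
    """Toy two-site store: every write is mirrored to a backup."""

    def __init__(self):
        self.primary = {}
        self.backup = {}
        self.primary_up = True

    def write(self, key, value):
        self.primary[key] = value
        self.backup[key] = value       # replicate to the counterpart site

    def read(self, key):
        if self.primary_up:
            return self.primary[key]
        return self.backup[key]        # transparent failover for the user

store = ReplicatedStore()
store.write("payroll.xlsx", b"...")
store.primary_up = False               # simulate a data-center outage
assert store.read("payroll.xlsx") == b"..."   # data is still available
```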

Data centers have become the wave of the future by housing servers, NAS and SAN systems with unprecedented amounts of storage. Unfortunately, due to the cost of owning a data center, companies long kept only redundant systems at a data center and kept most of their key systems on premises. Those key systems required constant human oversight to ensure high availability, plus engineers to perform maintenance. Buying a server and maintaining it, which includes applying the latest security patches, load balancing, error resolution and hardware updates, is a constant cost and struggle for companies. Most companies have nonetheless accepted that cost, because data is the heartbeat of any company, organization or government agency. As everything businesses and individuals do becomes increasingly digital, the need for data, and for somewhere to store it, only grows.

 

One aspect of technology that has grown enormously over the years is the ability to go virtual. VMware came out with its namesake hypervisor, Microsoft with Hyper-V, and Citrix with XenServer. What are these products? To be frank, they are another link in the chain leading to the “Cloud”: virtual servers. These storage and computing machines can be housed on a single physical server, which means one server with enough memory, storage and processing power can host anywhere from 2 to 15 or more servers virtually. So how is that done? At a high level, what Microsoft, Citrix and VMware did was allow servers to be created from the processors and cores of a physical machine, virtually partitioning the physical server’s resources so that a share is dedicated to each virtual server. In layman’s terms, if a server has 4 processors, 16 GB of RAM (Random Access Memory) and 120 GB of disk storage, the virtualization software can dedicate a slice of those physical resources to one or more virtual servers. For example, a virtual server might be assigned 1 processor, 2 GB of RAM and 20 GB of disk space when it is spun up. The physical server can therefore host multiple virtual servers, limited only by its physical resources, as the quick calculation below illustrates.
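As a back-of-the-envelope check on those numbers, here is a small Python sketch. The figures are just the hypothetical example from the paragraph above; real hypervisors also reserve resources for themselves and can oversubscribe CPU and RAM, so treat this as the simplest possible model.

```python
from dataclasses import dataclass

@dataclass
class Resources:
    cpus: int
    ram_gb: int
    disk_gb: int

def vms_that_fit(host: Resources, per_vm: Resources) -> int:
    """How many identical VMs fit on the host, with no oversubscription."""
    return min(host.cpus // per_vm.cpus,
               host.ram_gb // per_vm.ram_gb,
               host.disk_gb // per_vm.disk_gb)

host = Resources(cpus=4, ram_gb=16, disk_gb=120)   # example physical server
vm = Resources(cpus=1, ram_gb=2, disk_gb=20)       # example per-VM allocation
print(vms_that_fit(host, vm))  # -> 4; the CPUs are the limiting resource here
```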

This breakthrough quickly became a trend for top companies to follow. Going virtual saved a great deal of money up front on physical servers and saved money on the back end by shrinking large teams of engineers to the few needed to maintain a smaller number of physical hosts. While it reduced the number of engineers needed to maintain and monitor physical servers, the shift also increased demand for professionals who understand virtual technology architecture, design and support. Information technology is an ever-changing industry in which professionals must constantly learn, because continual updates and inventions quickly make everyday technology obsolete. To stay up to date and technologically advanced, professionals must keep studying better ways of using technology in order to remain expert providers.

Even though virtual technology uses physical equipment to maximize and multiply servers, it has also graduated into virtual desktops. Virtual desktops let users go from one computer to another and have access to all their data and their desktop as if they never left it; they simply use their secure logon ID on a completely different machine. All of this stems from virtual technology. With all these advancements, from virtual servers to virtual desktops, where is the next unconquered avenue for virtual technology? This is where the “Cloud” comes into play. The “Cloud” is the data centers we referred to earlier in this article. It is where companies can now eliminate their on-premises servers and have all their data stored in data centers they neither own nor operate, paying a price to the companies that do own the data centers and the physical servers. One of the dominant companies that has taken advantage of the move to the “Cloud” is Microsoft.

Microsoft has well over 42 data centers around the world and close to a million physical servers as it expands its cloud reach. Office 365 is its flagship, which started by allowing businesses and government agencies to get rid of their physical Exchange servers, cut down on the need for Exchange architects, and let users access their mail from anywhere, on any device, securely. This quickly set off a wave of companies signing up for Office 365. The next aspect of the Microsoft cloud came with Azure, which among other things makes Active Directory available virtually. Then came Intune for MDM (Mobile Device Management) and RMS (Rights Management Services). It wasn’t long before Microsoft bundled these products together as EMS (Enterprise Mobility Suite), a comprehensive offering of completely virtual products that saves companies millions of dollars while drastically increasing productivity. That is an easy decision, which is why government agencies have been signing up by the boatload for the Microsoft Government Cloud. The stigma of security concerns over the cloud is gradually waning as Microsoft has shown that its “Cloud” offerings not only comply with the security requirements of companies and government organizations but surpass them. Microsoft also lets users combine on-premises and “Cloud” technology in hybrid modes, so companies and agencies can keep using their current equipment while still being able to access their data via the Microsoft Cloud.
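To make the MDM piece a little more concrete, here is a minimal Python sketch of querying Intune-managed devices through the Microsoft Graph API. It assumes you already hold an OAuth 2.0 access token with the DeviceManagementManagedDevices.Read.All permission (for example, acquired via MSAL); token acquisition and error handling beyond the basics are out of scope here.

```python
import requests

# Microsoft Graph endpoint for Intune-managed devices.
GRAPH_URL = "https://graph.microsoft.com/v1.0/deviceManagement/managedDevices"

def list_managed_devices(access_token: str) -> None:
    """Print each managed device's name and compliance state."""
    headers = {"Authorization": f"Bearer {access_token}"}
    resp = requests.get(GRAPH_URL, headers=headers, timeout=30)
    resp.raise_for_status()
    for device in resp.json().get("value", []):
        # Each managed device reports, among other things, its name and
        # whether it currently meets the tenant's compliance policies.
        print(device.get("deviceName"), device.get("complianceState"))
```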

It’s clear the “Cloud” is not going anywhere anytime soon; in fact, it is the platform for even greater advancements in virtual technology to come. IT GURUS OF ATLANTA is a trusted Microsoft Partner certified to service clients wishing to advance to the Microsoft Cloud space. We are also paired with most major manufacturers as a reseller, providing computer hardware solutions to companies, government agencies and data centers. We pride ourselves on being a low-cost, thoroughly experienced, certified and quantifiable company. IT GURUS OF ATLANTA delivers services and products that exceed current standards while proving to be a one-stop IT service and hardware provider.

We hope you enjoyed our journey in learning about the “Cloud”: what it is, where it stands today, where it is headed, and how to get the most out of this growing technology.

 

IT GURUS OF ATLANTA LLC

“All Your Company Information Technology Needs Under One Company”

3355 Lenox Road Atlanta, GA 30326

www.itgurusatl.com | customerservice@itgurusatl.com

(888) 511-0143 OR (706) 406-5914

“As a registered SAM.gov company, we service government entities across the entire US and Canada”

“LIKE” us on Facebook at: http://www.facebook.com/itgurusatl / “FOLLOW US” on Twitter: http://www.twitter.com/itgurusatl / “CONNECT” with us on LinkedIn: http://www.linkedin.com/in/itgurusofatlanta/

SUBSCRIBE TO OUR NEWSFEED AND UPDATES AT: http://eepurl.com/cU_t7r

 

