Search the Community
Showing results for tags 'tech'.
-
Microsoft has a new adapter that lets Xbox One owners use their wireless controllers to play games on Windows 10 PCs and tablets. http://cnet4.cbsistatic.com/hub/i/r/2015/10/12/f345a5dd-2682-4499-a517-8ad66fd1a5b3/resize/570xauto/9b0c9214bfa6442ae52fd975319631fe/xbox-one-wireless-adapter.jpg Available starting October 20, the adapter will sell for $25. The USB-based adapter plugs into a Windows 10 computer or tablet and connects with the Xbox One wireless controller. From there, you can use the wireless controller to play PC games and Xbox games that are streamed to a Windows 10 device. Sorry, Xbox 360 owners, the adapter supports only the Xbox One. The adapter is a small but crucial piece of Microsoft's goal to unite PCs, tablets and the Xbox One video game console under the banner of Windows 10. Officially released on July 29, Windows 10 is Microsoft's attempt to draw as many users as possible to its operating system. One way of doing that is to bridge the PC and gaming worlds. The Xbox app for Windows 10 lets you stream games from the console to other devices, which may convince some Xbox One gamers to run the new OS on their PCs and tablets. The Xbox One is also part of Microsoft's "universal apps" strategy, which lets developers create games and apps for one platform and then tweak them to run on another platform. For example, a developer could create a game once using core programming code and then easily modify it to run on a Windows 10 PC, a Windows 10 mobile device and on the Xbox One. The new adapter and the Xbox One controller will support games designed for Windows 10. To generate interest among gamers, Microsoft has released several games optimized just for Windows 10, including Minecraft, Gigantic, Killer Instinct and Gears of War. Source http://www.cnet.com/
-
http://ec0c5a7f741a6f3bff65-dd07187202f57fa404a8f047da2bcff5.r85.cf1.rackcdn.com/images/8KVH3ucBd746.878x0.Z-Z96KYq.jpg How many features can you pack into the restrictive confines of a mini-ITX motherboard? Whatever the answer, Asus is coming close to it with the Maximus VIII Impact, which it announced at an event in San Francisco on Friday. The tiny motherboard is densely packed with an impressive feature list, including some capabilities that you won’t even find on midrange ATX motherboards. As a member of Asus’ high-end ROG Gaming line, though, the Maximus VIII doesn’t come cheap. This is a premium $250 board in a damn minuscule package. As you may be able to tell from the image above, the Maximus VIII Impact includes a built-in Wi-Fi adapter (802.11ac). It has a hefty VRM (voltage regulator module) mounted vertically at the top of the board, which offers some serious overclocking potential for both CPU and RAM. According to an Asus rep I spoke to at the event, they were able to push a Skylake CPU to its limits in the high 4GHz range on the Maximus VIII Impact. Also included: USB 3.1 Type-A and Type-C, an on-board sound card rather than a small sound chip, and a fan extension card for multiple fan inputs (which would be useful in a versatile mini-ITX case like the Cougar QBX). In addition to its four SATA ports, the Maximus VIII Impact includes the recently named U.2 connector, which Intel has used for its enterprise 750 SSD. The U.2 connector isn’t nearly as well known as the up-and-coming M.2, but it could end up being the connector of choice for the fast solid state storage of the future, with support for x4 PCIe speeds. If U.2 catches on, the Maximus VIII will definitely be a future-proof motherboard. http://e5c351ecddc2f880ef72-57d6ff1fc59ab172ec418789d348b0c1.r69.cf1.rackcdn.com/images/PzY5b3Krc9-A.878x0.Z-Z96KYq.jpg Source http://www.pcgamer.com/
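As a quick aside on that U.2 claim: x4 PCIe 3.0 tops out just under 4GB/s of raw bandwidth, which is where the "storage of the future" argument comes from. A back-of-the-envelope check (the PCIe 3.0 figures here are standard spec values, not from the article):

```python
# Rough bandwidth math for a x4 PCIe 3.0 link, as used by the U.2 connector.
# PCIe 3.0 signals at 8 GT/s per lane with 128b/130b line coding.
GT_PER_SECOND = 8.0             # gigatransfers per second, per lane
ENCODING_EFFICIENCY = 128 / 130 # 128b/130b coding overhead
LANES = 4

gigabits = GT_PER_SECOND * ENCODING_EFFICIENCY * LANES
gigabytes = gigabits / 8
print(f"x4 PCIe 3.0: ~{gigabytes:.2f} GB/s raw")  # ~3.94 GB/s
```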
-
Dell has redesigned the XPS 15 laptop, and it’s available today starting at $1,000 for the base model, and up to $1,700 if you want something with a little more power. Dell has added its edge-to-edge "InfinityEdge" display to the XPS 15, much like the display in its popular XPS 13, and it’s looking extra nice. You’ll have the option of a 1920x1080 resolution or the 3840x2160 4K touch display. Another cutting-edge inclusion: a USB 3.1 Type-C port (also compatible with Thunderbolt 3). An added bonus of the narrow bezel is that the 15.6-inch display now fits roughly into the size of an average 14-inch laptop. Its dimensions are 11-17mm x 357mm x 235mm, and it weighs 3.9lbs for the non-touch version and 4.4lbs for the touch version. http://ec0c5a7f741a6f3bff65-dd07187202f57fa404a8f047da2bcff5.r85.cf1.rackcdn.com/images/hWfSuKoZV5ci.878x0.Z-Z96KYq.jpg Dell has also upgraded to Skylake processors, letting you choose between the 2.7 GHz Core i3-6100H, the quad-core 3.2 GHz Core i5-6300HQ, and the quad-core 3.5 GHz i7-6700HQ. As for graphics, you’ll be choosing between Intel HD Graphics 530 and an Nvidia GeForce GTX 960M. The latter isn’t the most powerful card out there, but it's capable of good 1080p gaming performance—it's roughly equivalent to a desktop 750 Ti. 8GB of DDR4 memory at 2133 MHz comes with most of the configurations, with 16GB on the most expensive i7 model, but you can upgrade to 32GB if you want. For storage, you’ve got the options of a 500GB HDD + 32GB flash, or a 1TB HDD + 32GB flash. Solid state drives are also available, in 256GB, 512GB, and 1TB PCIe options. Dell claims the XPS 15 should be able to get up to 16 hours of battery life, depending on configuration. Expect to get less than that when you're gaming, of course. Source http://www.pcgamer.com/
-
http://cnet3.cbsistatic.com/hub/i/2015/10/05/9c7ababa-bb5b-41bf-bc9b-ee3219825a55/feccd4e07e9eb36dd166f6131998247e/bud-light-e-fridge.jpg Manufacturers like Samsung and LG have released a number of unsuccessful smart refrigerators over the years. And now Bud Light is trying its hand at connected cooling with a new app-enabled beer fridge that has Wi-Fi built right in. Dubbed the "Bud-E" (and slathered on all sides in Bud Light-blue branding), the new smart mini-fridge will be available online at Bud Light's website for $600. But first there'll be an initial test run in California where the price will be discounted by half, to $300. The fridge underneath that coat of bright blue paint is the Linq IQ, developed by Buzz Products, a design firm with offices in the United States, Europe, Asia, and Australia. The Linq IQ is licensed exclusively to Anheuser Busch, so for now it'll only come in the Bud Light-themed design. Or, if you're in Canada, you can get it in a red Budweiser finish. Source http://www.cnet.com/
-
Many people were disappointed when the LG G4's specs leaked and revealed that it would be equipped with a Snapdragon 808 chipset. However, the device performed amazingly fast, on par with flagship devices running the more powerful octa-core 810 chipset. With the G4 a hit, LG has gained the trust of its buyers with its high-standard products and is ready to manufacture its new flagship device, whose specifications were recently leaked online. http://4.bp.blogspot.com/-5lWD2EW071U/VghseLjYncI/AAAAAAAAJ0g/A1B5F2dB32Y/s400/ap_resize.jpg LG's flagship devices are best known for their camera features, high-resolution screens and smooth functionality. Continuing in that vein, the LG G5 is going to be equipped with a 20 MP 1/2″ Sony camera sensor, which should deliver high-quality images in combination with laser-focus technology and LG's manual camera interface. If you are into photography, the G5 could be the smartphone for you. This time LG has decided to go with Qualcomm's flagship processor, the Snapdragon 820, which swaps ARM's big.LITTLE Cortex cores for Qualcomm's own in-house Kryo cores. According to Qualcomm, the new 820 chipset won't suffer the heating and underclocking issues that plagued the 810. The 820 also comes with an upgraded DSP (Digital Signal Processor) and an Adreno 530 GPU. The G5 is expected to launch next year, most probably at the end of the first quarter. Source http://www.unrevealtech.com/
-
You would think that, because Windows 10 comes with Microsoft Edge preinstalled, you'd be able to ditch Internet Explorer. But you'd be wrong. Internet Explorer 11 comes preinstalled on Windows 10 -- and no, you can't uninstall it. But you can turn it off. This Control Panel hack works in previous versions of Windows, too. Here's how to do it:
http://cnet2.cbsistatic.com/hub/i/r/2015/09/24/cd7c8cb6-4dc6-46de-bcd1-28dbb2e6e224/resize/370xauto/f471ca4c85c769fab11f4bc4f2b69719/1-open-control-panel.png
1. Right-click the Start menu icon and click Control Panel to open the Control Panel.
http://cnet3.cbsistatic.com/hub/i/r/2015/09/24/6f30b9dd-1d68-4b94-a7a2-c571267baf45/resize/770x578/b46f24e9ce12ce8d4d9b942730c2af5b/1-programs-and-features.png
2. If your Control Panel is in Category view (look at the upper right corner, you should see View by: followed by Category, Large icons, or Small icons), go to Programs > Programs and Features. If your Control Panel is in Large icons or Small icons view, go to Programs and Features.
http://cnet1.cbsistatic.com/hub/i/r/2015/09/24/5e1fa46e-db75-4a2c-a360-58b268740d62/resize/770x578/2068aa1bb8422e1bf34079fd040c6ae7/1-turn-off-windows-features.png
3. On the left side of the Programs and Features window, you should see a link with a blue and yellow shield next to it that says Turn Windows features on or off. Click this link to open the Windows Features window.
http://cnet4.cbsistatic.com/hub/i/r/2015/09/24/4b1dad0c-c75e-420f-9a8c-48ad1c2f3185/resize/770x578/172b8c9ec5bde9462705d71ae2fa8ca9/1-windows-features-window.png
4. In the Windows Features window, find Internet Explorer 11 and uncheck the box next to it. A warning window will pop up notifying you that turning off Internet Explorer 11 might affect other Windows features and programs -- click Yes to continue. Click OK.
http://cnet1.cbsistatic.com/hub/i/r/2015/09/24/148cc663-ce84-4212-a2c2-85e1a03f9caf/resize/770x578/1a82617f1b6587124e0264918ae8097b/1-restart-or-dont-restart.png
5. Once Windows turns off Internet Explorer 11, it will ask you to reboot your PC. You can choose to either Restart now or Don't restart, in which case the changes will be made when you restart your computer in the future.
Source http://www.cnet.com/
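If you'd rather script this than click through Control Panel, the same Windows Features toggle is exposed through DISM. A minimal sketch (run from an elevated prompt; Internet-Explorer-Optional-amd64 is the usual feature name on 64-bit Windows, but confirm it on your machine with dism /online /Get-Features):

```python
# Sketch: turn off Internet Explorer 11 via DISM rather than the GUI.
# Requires an elevated (administrator) prompt on 64-bit Windows.
import subprocess

result = subprocess.run(
    ["dism", "/online", "/Disable-Feature",
     "/FeatureName:Internet-Explorer-Optional-amd64", "/NoRestart"],
    capture_output=True, text=True,
)
print(result.stdout)
if result.returncode != 0:
    # Feature names can vary by edition; list them to find the right one.
    print("DISM failed -- check names with: dism /online /Get-Features")
```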
-
http://www.extremetech.com/wp-content/uploads/2015/09/BMW-X3-2015-640x353.jpg Fallout from the Volkswagen diesel scandal continues to grow, as this morning brings reports that Audi’s head of research and development Ulrich Hackenberg and Porsche’s engine chief Wolfgang Hatz are both out — two of the top engineering figures in Volkswagen’s other flagship brands. That comes as a bit of a surprise, even though it’s widely expected more heads will roll at VW aside from Martin Winterkorn, the CEO of the company’s worldwide operations, who resigned effective yesterday. Meanwhile, there’s word from the German newspaper Auto Bild that BMW’s diesel engines were also “significantly” exceeding regulatory limits, CNBC reports, with the BMW X3 2.0-liter diesel model spitting out 11 times more nitrogen oxide than the current limit set by the European Union. “[We did not] manipulate or rig any emissions tests. We observe the legal requirements in each country and adhere to all local testing requirements,” BMW said in a statement in response to the allegations. “When it comes to our vehicles, there is no difference in the treatment of exhaust emissions whether they are on rollers (e.g. test bench situation) or on the road…We are not familiar with the test mentioned by Auto Bild concerning the emissions of a BMW X3 during a road test. No specific details of the test have yet been provided and therefore we cannot explain these results.” http://www.extremetech.com/wp-content/uploads/2015/09/2012_audi_a3_wagon_20-tdi-premium_s_oem_1_1600-640x426.jpg Late last week, news of the diesel scandal broke as VW admitted to cheating on emissions tests with the use of a software-based defeat device on almost 500,000 of its 2.0-liter “clean diesel” TDI engines sold in versions of the Jetta, Jetta SportWagen, Golf, new Golf SportWagen, Passat, and Beetle, as well as the Audi A3, since the 2009 model year. The scandal encompasses both the 140-horsepower, 236 lb-ft-of-torque blocks and the newer 150-hp engines released beginning with 2015 models. Later VW admitted that the software is actually installed on over 11 million vehicles globally. While diesel engine sales account for less than one percent of the passenger car market in the US, they had been growing, and diesels make up much more of the European car market thanks to higher fuel prices, looser emissions standards, and widespread fudging — reports abound of car manufacturers taping the doors shut and folding in mirrors to improve aerodynamics during tests, for example. It’s already known that VW has cheated in both the US and Europe. But the fact that its other brands may be exposed in the same manner, and that other auto manufacturers like BMW may also be involved, improves the odds that diesel may soon be dead in cars in the US once more. Source http://www.extremetech.com/
-
http://4d663a369f9f03c3c61e-870e77779efd63f7bd6c2ee08d8cfae6.r2.cf1.rackcdn.com/images/zflneafbmjR5.878x0.Z-Z96KYq.jpg When Nvidia launched the 980M last October, it claimed that the laptop GPU could hit 75-80% of the performance of the desktop card. That was impressive compared to the mobile GPUs of a generation or two back, but not as impressive as what Nvidia’s done now: fit the entire, uncompromised desktop 980 GPU into gaming laptops. Starting early October with six laptops from MSI, Gigabyte, Asus and Clevo, the GTX 980 is making its way into laptops, with all 2048 CUDA cores intact. And it’s overclockable. To show off the full-on 980 doing its thing in laptops, Nvidia demoed several laptops side-by-side with desktop machines to compare benchmarks. In Shadow of Mordor, Tomb Raider, and a couple other benchmarks, the laptop systems were able to turn in nearly identical scores—at worst, about 5% off what the desktop machine delivered. In those cases, it wasn’t even quite a fair fight, since a laptop CPU was up against a more powerful desktop CPU. Some of the laptops were actually equipped with desktop parts and delivered dead-even performance. In one case, 3DMark actually turned in identical scores, right down to the point. The GTX 980's memory will run at a 7 Gbps data rate compared to the 980M’s 5 Gbps. Whereas the mobile 980M previously only had 3-phase power supplies, the new cards will be outfitted with 4-8 phase power supplies, which will vary by laptop. Every system that ships with a 980 will have a custom-tuned fan curve to keep the card cool, but it’ll also ship with some sort of overclocking tool, like MSI Afterburner or Gigabyte’s OC Guru. http://e5c351ecddc2f880ef72-57d6ff1fc59ab172ec418789d348b0c1.r69.cf1.rackcdn.com/images/OkZuj-UOrfJL.878x0.Z-Z96KYq.jpg How overclockable the 980 will be will naturally vary from laptop to laptop, as different systems will have different thermal constraints. You can bet that the Asus GX700, that watercooled beast of a laptop we wrote about last week, will be able to push the 980 to its limits. All of this performance requires that the laptop be plugged into AC power, however. Without the extra juice from the wall socket, the 980 will deliver roughly equal performance to the 980M. Here are the ones coming in the near future:
- Asus GX700
- Clevo P775DM-G
- Clevo P870DM-G
- Gigabyte Auros RX7Y4
- MSI GT72
- MSI GT80
Nvidia also told us that, as with the 980M, there will be SLI configurations of the full-size 980 in laptops, too. That’s likely as much GPU muscle as you’re going to find in a laptop until sometime in 2016, when Nvidia has a new generation of cards to roll out. Source http://www.pcgamer.com/
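For a sense of what those per-pin data rates mean in practice: the desktop GTX 980 pairs its 7 Gbps GDDR5 with a 256-bit bus (a published desktop spec; since the laptop part is claimed to be the same GPU, the same bus width is assumed here), so total memory bandwidth works out as follows:

```python
# Per-pin GDDR5 data rate -> total memory bandwidth.
# Assumes the GTX 980's 256-bit memory bus (desktop spec).
BUS_WIDTH_BITS = 256

def total_bandwidth_gb_s(per_pin_gbps: float) -> float:
    """Total bandwidth in GB/s given the per-pin data rate in Gbps."""
    return per_pin_gbps * BUS_WIDTH_BITS / 8

print(f"GTX 980  at 7 Gbps: {total_bandwidth_gb_s(7.0):.0f} GB/s")  # 224 GB/s
print(f"GTX 980M at 5 Gbps: {total_bandwidth_gb_s(5.0):.0f} GB/s")  # 160 GB/s
```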
-
Canon, the Japanese camera manufacturing giant, recently launched its 250-megapixel (19,580 x 12,600 pixels) CMOS sensor in Tokyo on September 7, 2015. The APS-H-size sensor (approx. 29.2 x 20.2 mm) has also set a world record for the most pixels in a CMOS sensor smaller than 35mm. http://2.bp.blogspot.com/-B6XtUwUZdnQ/Vfw-mGTAnTI/AAAAAAAAJyE/k6wqu9DLRpM/s640/s_95d04a63ef7640d696847569d7d01b7b.jpg Canon also claims that the sensor can capture images in which lettering on the side of an airplane is distinguishable from 18km away. High-megapixel CMOS sensors generally face problems such as signal delays and slight discrepancies in timing due to the high pixel count and high signal volume. However, the new CMOS sensor, with advancements in circuit miniaturization and enhanced signal-processing technology, provides an ultra-high signal readout speed of 1.25 billion pixels per second despite its exceptionally high pixel count. The sensor can also record ultra-high-resolution video that is 125 times sharper than Full HD (1920x1080) and 30 times sharper than 4K (3,840 x 2,160 pixels), which lets users crop and magnify footage without sacrificing image resolution and clarity. According to Canon, the technology won't be used in DSLRs; it is instead aimed at commercial applications such as specialized surveillance and crime-prevention tools, ultra-high-resolution measuring instruments and other industrial equipment, and the field of visual expression. Source http://www.unrevealtech.com/
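Canon's comparison figures are easy to sanity-check from the stated resolution; the raw pixel-count ratios land close to (slightly under) the round numbers quoted above:

```python
# Sanity-check the headline figures from the stated 19,580 x 12,600 resolution.
sensor_px = 19_580 * 12_600          # ~246.7 million pixels ("250MP" rounded)
full_hd_px = 1920 * 1080
uhd_4k_px = 3840 * 2160

print(f"Total pixels:  {sensor_px / 1e6:.1f} MP")
print(f"vs Full HD:    {sensor_px / full_hd_px:.0f}x the pixels")   # ~119x
print(f"vs 4K UHD:     {sensor_px / uhd_4k_px:.0f}x the pixels")    # ~30x
# At the quoted 1.25 billion pixels/second readout, one full frame takes:
print(f"Frame readout: {sensor_px / 1.25e9 * 1000:.0f} ms")         # ~197 ms
```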
-
DARPA wants to build a robotic waystation in Earth’s orbit http://www.extremetech.com/wp-content/uploads/2015/09/Space-640x360.jpg Everyone can see that when it comes to space, real progress is going to require some innovative new ideas. Maybe that will come in the form of a 100,000 kilometer ribbon of experimental nanotubes stretching all the way to geosynchronous orbit, or perhaps just an enormous, spinning spiral ramp. But any solution must give us a better ability to get to space and do work once we get there. Now, rumblings from DARPA and NASA show that they may be fantasizing about a new, semi-permanent installation in space — and they’re already working on the technology that could make it a reality. The idea is basically to create a construction, repairs, refueling, and mission restart hub, in space. Currently, all these functions require a return to base — the ISS receives shipments of supplies, it doesn’t generally dole them out. With such a station, NASA could imagine a new satellite design, pick a currently defunct old orbiter, and send up only those parts necessary to transform the old into the new. The solar panels, thrusters, and other time-tested hardware can stay intact, while computers and scientific instruments are swapped out by a series of robotic arms and manipulators. These arms are reportedly already in the works, and are souped-up versions of the space shuttle’s original Canadarm. These would be capable of doing all the complex manipulation needed by an orbiting robot space mechanic. DARPA is already doing work on a mission called Project Phoenix, which looks to reuse the most valuable parts of old, dead satellites — it has also been working on grasper technology that could shear apart and potentially reassemble old space tech. In fact, this idea for a space-based repair station seems almost like a successor project to Phoenix, making its piecemeal efforts into an automated repair station. http://www.extremetech.com/wp-content/uploads/2015/09/space-station-2.jpg Speaking at DARPA’s Wait What? conference (yes, that’s what it’s called) in St. Louis, former NASA astronaut Pam Melroy, now deputy director of DARPA’s Tactical Technology Office, said that some sort of orbital staging and upgrade station could change the way NASA deals with space. The ISS orbits at a messy 400 kilometers, well within “low” Earth orbit, meaning that a geosynchronous station would open up all sorts of new possibilities. She said that it could do for the Earth what the great port cities of yore did for Europe — leading to perhaps the first ever time I’ve hoped that Mars doesn’t have any indigenous inhabitants. The idea, as proposed, is to build this station in geosynchronous orbit, or around 36,000 kilometers above the surface. At this height, it could enter an orbit that would keep it directly above a specific spot on the Earth’s surface, but it’s also too high to enjoy any real protection from the Earth’s atmosphere or magnetic field — this hypothetical station would need to either be shielded in some all-new way or, more likely, be robotically controlled for the vast, vast majority of the time. Even with some sort of super-next-gen launch technology like a space elevator, it’s a certainty that on a long enough timeline, we’ll have to eventually stop building spaceships anywhere but in space.
We’ll never be able to mine resources in a vacuum, but other than that there’s nothing about the shipbuilding or maintenance process that has to be done on the surface; not that there was ever any doubt, but we now know that NASA and the US military are very aware of this fact. Source http://www.extremetech.com/
-
Acer's Predator gaming series spans PCs, high-powered laptops and gaming monitors. Following the trend of the series, Acer recently launched its gaming Android smartphone, named the Predator 6, at IFA 2015 in Berlin. http://4.bp.blogspot.com/-npI7McLlS4k/VenMReyh03I/AAAAAAAAJrA/9N8qtWVWj7o/s640/20150831SXUPZFWMHPWYIPVULOUQTSJF.jpg The Predator 6 features a deca-core (10-core) MediaTek chipset, the MT6797, more popularly known as the Helio X20. For top-end performance it features two Cortex-A72 cores clocked at an amazingly high 2.5GHz, supported by four Cortex-A53 cores at 2.0GHz. For low-end applications it features four more Cortex-A53 cores (bringing the total to 10) running at 1.4GHz for prolonged battery life. While gaming, you would mostly be powered by the Cortex-A72 cores, supported by the A53 cores clocked at 2.0GHz; the remaining cores handle regular tasks and, as in most octa- and hexa-core phones, provide no performance increase. The lower-frequency cores are there purely to extend battery life. In my opinion, boasting about the number of cores is mostly a marketing strategy, as the high-end cores draw far more battery. The smartphone houses four speakers, nicely wrapped in four distinct orange grilles at the top and bottom of the phone. Since sound matters a lot to the gaming experience, the four speakers should provide an immersive one. As for ports, it has a 3.5mm jack on the top and a mini USB 2.0 port on the bottom of the phone. http://2.bp.blogspot.com/-5HNdAupqlsQ/VenMpZybjmI/AAAAAAAAJrI/kK9lK1V12bY/s640/gsmarena_003.jpg The body of the gaming smartphone looks like a Lamborghini, easily stands out against other smartphones and will probably attract most gamers. The brushed metal shell is edged in a rectangular shape with an awesome-looking Predator logo on the back, which looks amazing and also provides great grip while gaming, especially in landscape mode. The display is a 6.0-inch 1920x1080 full HD panel, driven by a Mali-T880 MP4 GPU and supported by 4GB of RAM. The smartphone also features a 21-megapixel rear camera, runs on Android and includes haptic feedback for gamers. Acer is also rumored to be designing a gaming interface for the Predator series that would run on top of Android, much like Samsung's TouchWiz. The price and release date of the device aren't yet confirmed. Source http://www.unrevealtech.com/
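To make the tri-cluster arrangement easier to picture, here's a toy sketch using the core counts and clocks quoted above; the load thresholds are made up for illustration, since real cluster selection is handled by the kernel's scheduler:

```python
# Toy model of the Helio X20's tri-cluster layout as described above.
# Thresholds are illustrative only; the OS scheduler does the real work.
CLUSTERS = [
    {"cores": 2, "type": "Cortex-A72", "ghz": 2.5, "role": "peak/gaming"},
    {"cores": 4, "type": "Cortex-A53", "ghz": 2.0, "role": "everyday apps"},
    {"cores": 4, "type": "Cortex-A53", "ghz": 1.4, "role": "light/background"},
]

def cluster_for(load: float) -> dict:
    """Pick a cluster for a normalized load in [0, 1]."""
    if load > 0.7:
        return CLUSTERS[0]
    if load > 0.3:
        return CLUSTERS[1]
    return CLUSTERS[2]

c = cluster_for(0.9)
print(f"Heavy load -> {c['cores']}x {c['type']} @ {c['ghz']}GHz ({c['role']})")
```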
-
Nvidia launched their GTX 950 card last week at $150, which was a pretty good deal considering the price-performance ratio; Nvidia's old offering in that price range, the 750 Ti, had been around for a long time and was facing stiff competition from AMD's new launches. Colorful iGame has also designed a non-reference version, which is currently only available to testers and showed pretty amazing overclocking performance. http://4.bp.blogspot.com/-gSZLAl5s5Wo/VeDzuXKx6MI/AAAAAAAAJk4/e0dVIR7WRUc/s640/ChMkJlXaimGIRCbKAAFOgGDEKwUAAAe4gL_B7UAAU6Y561-635x397.jpg Specification- The Colorful iGame GTX 950 OC is based on the GM206-250-A1 graphics core, which features 768 CUDA cores, 48 texture mapping units and 32 raster operation units. The card runs at a 1140MHz base and 1329MHz boost clock on the overclocked BIOS, in comparison to the 1026MHz base and 1190MHz boost clock of the reference BIOS. The card has 2GB of GDDR5 memory running on a 128-bit interface and has a TDP of 90 watts; on the overclocked BIOS the TDP is 110 watts, 20 watts higher than the reference version. The card is available at a $20 premium for $179, compared to $159 for the reference version. Design- http://4.bp.blogspot.com/-whfsY9XkMxU/VeDzzDO7a2I/AAAAAAAAJlA/aF1qh-6XXgs/s640/mafia_3.0.0.jpg The card is cooled by a dual-fan cooler, which consists of a small aluminium block and uses a single copper heatpipe to move heat away from the GPU. The card comes with a standard backplate and draws power through a single 6-pin power connector. A switch lets you flip between the overclocked and normal BIOS. Display outputs consist of a DVI port, HDMI and three DisplayPorts. Overclocking And Performance- On the overclocked BIOS setting, the card performs close to the GTX 760, 15% slower than the 960 and mostly on par with the Radeon 370. The card turns out to be a heavy overclocker: users managed to push it to an insane 1506MHz core clock and 2004.8MHz memory clock, which is effectively an 8.0GHz data rate, without any voltage adjustments. At this OC configuration the card draws 171 watts, in comparison to 162 watts on the overclocked BIOS and 154 watts on the reference BIOS, and is 2-5% faster than the GTX 760 and 7-10% slower than an overclocked GTX 960. At full load the temperatures didn't cross the 70C mark. http://3.bp.blogspot.com/-XJzWiHuf2CI/VeDxungIXzI/AAAAAAAAJkU/o5zTiY3bX2k/s640/mafia_3.0.0.jpg With a small voltage increase to 1.2V, users managed to overclock the card to insane speeds of a 1600MHz core clock and 2000MHz memory clock. The temperatures in this configuration also didn't cross the 70C mark. http://4.bp.blogspot.com/-e7pk0uvp3iE/VeDx6Ecar5I/AAAAAAAAJkc/5fDddo1-7ZM/s640/mafia_3.0.0.jpg Source http://www.unrevealtech.com/
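The "2004.8MHz = effectively 8.0GHz" line is just GDDR5's quad data rate at work, and the resulting memory bandwidth follows from the 128-bit bus. A quick check of the numbers above:

```python
# GDDR5 moves four data words per clock, so the "effective" figure quoted in
# reviews is the memory clock x 4; bandwidth then follows from the bus width.
def effective_gddr5_gbps(mem_clock_mhz: float) -> float:
    return mem_clock_mhz * 4 / 1000

def bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int = 128) -> float:
    return effective_gddr5_gbps(mem_clock_mhz) * bus_width_bits / 8

print(f"Effective rate: {effective_gddr5_gbps(2004.8):.1f} Gbps")  # ~8.0 Gbps
print(f"Bandwidth:      {bandwidth_gb_s(2004.8):.0f} GB/s")        # ~128 GB/s
```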
-
http://www.extremetech.com/wp-content/uploads/2015/03/XboxOne-640x353.jpg While Windows 10 isn’t officially available until the end of July, early adopters have been running the Insider Preview for months now. And as of a few days ago, those of us running the latest build can take advantage of the Xbox One game streaming functionality in the Windows 10 Xbox app. In a post on the Xbox Wire, Larry “Major Nelson” Hryb walks us through the process of enabling this shiny new feature. If you want to give it a go for yourself, start by heading to the Settings menu on the Xbox One. Under Preferences, you should be able to toggle on a setting titled Allow game streaming to other devices. http://www.extremetech.com/wp-content/uploads/2015/07/Xbox-App-640x353.png Next, make sure you’re running the latest Windows 10 build and the newest version of the Xbox application. Launch the app, navigate to Connect, and select Add a device. Provided you’re on the same network, you should be able to select your Xbox One from this menu. Plug in a controller, go to the Home tab, and then select your console under the Game Streaming section. http://www.extremetech.com/wp-content/uploads/2015/07/Remote-Play-640x363.jpeg Of course, Microsoft isn’t the first out of the gate for local game streaming. Sony allows you to stream PS4 games to the Vita and PlayStation TV over Remote Play, Valve offers in-home streaming in the Steam client, and even Nintendo offers off-TV play for many titles on the Wii U. It’s nice to see Microsoft finally taking advantage of the massive Windows market, but why did it take so long? I called out the Xbox One’s lack of game streaming over a year ago, and Microsoft is just now rolling out that functionality. We can add this new feature to the long list of improvements that Redmond has made in an attempt to right the ship after the initial debacle. The strict DRM, the focus on television, and the reliance on the Kinect were all missteps, but there were also countless small issues that needed to be corrected after the troubled reign of Don Mattrick. With the major price drops, the backwards compatibility announcement, and now the game streaming roll-out, Microsoft’s Phil Spencer has done a superb job revitalizing the Xbox One. But is it too late to undo the damage done to the Xbox name? The PS4 quickly took hold as the dominant platform this generation, and Microsoft has been playing catch-up ever since. For me, game streaming is a must-have feature, so this update is a major selling point. I currently keep my PS4 in my home office, and frequently stream to the PlayStation TV in my bedroom. Now, the only thing keeping me from pulling the trigger on an Xbox One is a lack of notable exclusive games. But with Quantum Break and ReCore on the horizon, hopefully that will change in 2016. Source http://www.extremetech.com/
-
http://cdn.mos.techradar.com/art/TRBC/Generic/future-data-centre-470-75.jpg What is a storage bottleneck? And how can you avoid it? Thomas Pavel, EMEA Storage Sales Director at Avago Technologies, told us about the strains caused by the data deluge and how your organisation can avoid them. TechRadar Pro: What are the biggest challenges of the data deluge? Thomas Pavel: The volumes of published information and data continue to grow unabated, fuelled by demanding applications like business analytics, social media, video streaming and grid computing. Many organisations, regardless of their area of business, want insight from new and unstructured sources such as news reporting, web usage trends and social media chatter. The ability to access and retrieve data quickly is also a major factor contributing to business success and/or customer satisfaction. But there's a lot of data to handle. Just keeping up with this relentless growth and storing of data is challenge enough, but how to deal with such vast volumes of data cost-effectively? And perhaps most importantly: How to maintain or even improve storage performance? TRP: What is a storage bottleneck? Where and when do bottlenecks tend to occur? TP: As the volume of data increases, so too can the time it takes to access it. This is known as a 'bottleneck'. There are many potential locations for 'pain points' or bottlenecks in an enterprise system, so locating the bottleneck is not always simple. Addressing the bottleneck and maintaining performance is the rationale behind continual advances in storage technologies today. When designing storage systems for performance, it is essential to understand where the bottlenecks can occur. This is especially true given that the bottlenecks change with each new generation of technology along the data storage path. The three most critical elements that affect storage performance are the server's Peripheral Component Interconnect Express (PCIe®) bus, the SAS solution as implemented in host bus adapters (HBAs) and expanders, and the disk drives themselves, which can have either a SAS or a SATA interface. Storage bottlenecks migrate among the successive generations of the various technologies involved end-to-end. With the advent of third generation PCIe, for example, second generation SAS became the new storage bottleneck. Third generation SAS is now able to take full advantage of third generation PCIe's performance, making PCIe the new bottleneck in systems using 12Gb/s SAS. TRP: What guidelines can we use to maximise storage system performance? TP: When designing a storage system for high performance, it is necessary to understand the throughput limitations of each element. Critical applications must also scale easily over time while remaining both highly protected and easily manageable. SAS is now in its third generation, and the performance has doubled with each new generation, from the original 3Gb/s to 6Gb/s and now 12Gb/s. SAS, like PCIe, uses lanes, and high-performance storage systems normally aggregate multiple SAS lanes to support high data rates. TRP: Does the storage bottleneck change with different system configurations? TP: This table provides a summary of some sample configurations showing where the bottleneck exists when configured with a "full complement" of disks (the slowest element in the system). As shown, the need to support more disks (for capacity) requires the use of later generations of SAS and/or PCIe, and/or more SAS lanes.
Looking at it another way, in systems with a small number of disks, their relatively low aggregate throughput becomes the bottleneck, so there is no need to "over-design" the configuration with later generation technologies and/or more SAS lanes. The disks referenced in the table example all have a 6Gb/s interface with a throughput of 230MB/s and 550MB/s for the 15K RPM HDDs and SSDs, respectively. Note that the table assumes all drives are operating at their maximum throughput simultaneously, and this does not always occur. It is also important to note that IOPS are often more critical than throughput in many applications today, depending on the circumstances. For these reasons, each configuration is normally able to support many more disks than indicated. TRP: So how can SAS third generation improve performance for businesses? TP: Being able to move at 12Gb/s means that measurements of over one million IOPS can be achieved. 12Gb/s SAS is an evolutionary change and a big step forward for the market. For the first time IT managers will be able to exploit the full potential of PCIe 3.0. This in turn will benefit businesses that rely on mission-critical data in a variety of environments, including transactional databases, data mining, video streaming and editing. TRP: What are the issues in migrating to SAS third generation? TP: The primary issue in the migration to third generation SAS is a familiar one: investment protection. Most organisations have made a significant investment in SAS disks, and want to preserve that investment when migrating to 12Gb/s SAS technology. The problem is: The third generation SAS standard maintains backwards compatibility by throttling down to the slowest SAS data rate in the system. In small-scale point-to-point configurations, this is not always an issue because the migration would require upgrading both an Initiator and its Target. But in most organisations, such point-to-point configurations are rare. The system-level "slowest data rate" performance limitation, therefore, means that organisations without point-to-point configurations would not be able to achieve the 12Gb/s performance boost until all disks support this new standard. TRP: How can this issue be overcome? TP: Fortunately there is a way to overcome this limitation, and that requires understanding a little about how SAS expanders function. A SAS expander makes it possible for a single (or multiple) Initiator(s) to communicate with multiple Targets concurrently. Expanders help make SAS remarkably scalable, and because each is capable of supporting multiple disks, expanders also make it possible to aggregate the throughput of those disks.
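Pavel's "don't over-design" point can be modelled with the figures he quotes: 230MB/s per 15K HDD, 550MB/s per SSD, and the usual per-lane SAS rates (roughly 300/600/1200MB/s for the 3/6/12Gb/s generations after encoding overhead). A simple sketch, assuming every drive streams flat out as the table did:

```python
# Toy bottleneck finder: aggregate disk throughput vs. the SAS link feeding it.
# Assumes all drives stream at max speed simultaneously and ignores protocol
# overhead beyond line coding -- deliberate simplifications.
SAS_LANE_MB_S = {"3Gb/s": 300, "6Gb/s": 600, "12Gb/s": 1200}
DISK_MB_S = {"15K HDD": 230, "SSD": 550}

def bottleneck(n_disks: int, disk: str, sas_gen: str, lanes: int) -> str:
    disks_total = n_disks * DISK_MB_S[disk]
    link_total = lanes * SAS_LANE_MB_S[sas_gen]
    limiter = "the disks" if disks_total < link_total else "the SAS link"
    return (f"{n_disks}x {disk}, {lanes} lanes of {sas_gen} SAS: "
            f"{disks_total} vs {link_total} MB/s -> bottleneck is {limiter}")

print(bottleneck(4, "SSD", "6Gb/s", 4))    # disks are the limit
print(bottleneck(8, "SSD", "6Gb/s", 4))    # the link saturates first
print(bottleneck(8, "SSD", "12Gb/s", 4))   # 12Gb/s SAS relieves the link
```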
-
http://cdn.mos.techradar.com/art/Watches/Samsung/Galaxy%20Gear%202/Gear2s_onbackground-470-75.jpg eBay-owned PayPal was curiously left out of the impressive lineup of banking heavyweights during the Apple Pay launch last week, but a new report claims that may be because the payment giant is putting all of its eggs in Samsung's basket instead. Business Korea reported yesterday that Samsung Electronics plans to follow Apple down the smartwatch payment hole, and is said to be teaming up with one of the leading mobile payment services around to accomplish that goal. According to an unnamed "high-ranking official" at Samsung, one of the manufacturer's third-generation smartwatch devices will offer "simple payment functions" powered by PayPal, and protected by some form of "fingerprint identification technology." Ironically, PayPal - who publicly dissed Apple Pay only last week - is not even available in Samsung's native country of South Korea, although the smartwatch-based service is expected to debut in 25 other countries, eventually expanding to more than 50 around the globe.
Payment watch
Samsung is reportedly eyeing early 2015 for the launch of its payment-based smartwatch, presumably using the annual Mobile World Congress event as a springboard for doing so. Perhaps not-so-coincidentally, early next year is also the same timeframe Cupertino has already staked out for its own Apple Watch, which will be capable of making contactless payments even without being connected to a compatible iPhone. Biometric expert Synaptics will reportedly provide fingerprint verification technology for Samsung's future smartwatch; the company is a member of the Fast Identity Online Alliance, which also includes PayPal, Bank of America, Visa and Google among its ranks. Synaptics Chief Executive Officer Richard Bergman confirmed that "wearable devices with fingerprint verification and relevant solutions will be released early next year," suggesting that Samsung and Apple won't be alone in duking it out for wearable payment domination.
-
http://cdn.mos.techradar.com/art/features/World-changing%20tech/FMU256.reg_contents.atmosphere-470-75.jpg Big data is an integral part of the information strategy of many businesses, driving operational efficiencies and competitive advantage. For organisations wanting to leverage big data for business advantage, its volume, variety and velocity present a complex mix of challenge and opportunity. One of the most compelling opportunities for businesses is to gain added value by analysing big data in a geographic context. The emergence of location analytics tools is driving the ability to discover location-based patterns and relationships from data that may exist in disparate places, streams or web logs. It is also enabling organisations to visualise and analyse big data to reveal previously hidden patterns. Here, I provide tips outlining how your business, whatever its size and type, can get the most out of a location analytics implementation.
Maximising your analytics strategy
1. If you want to see the story behind your data, bring together maps with multiple data layers. You can combine big data with maps to optimise your information assets. Retailers can see where promotions are most effective and where the competition is. Credit card companies can map data from transactional systems, customer information and social media to build profiles of card users to help shape outbound marketing strategies. Climate change scientists can combine data with maps to see the impact of shifting weather patterns. By visualising data on maps, businesses across all sectors can start to uncover previously hidden treasures.
2. Use location analytics to interrogate your data in real time. Spatially-enabled data on a map allows you to answer questions and ask new ones. Where are disease outbreaks occurring? Where is the insurance risk greatest given recently-updated data? Taken one step further, by integrating social media data into these maps, you can now track dynamic behaviour and sentiment in real time. There are huge potential benefits. Big data technologies provide access to unstructured machine-generated, web-generated and NoSQL data. Map visualisation and spatial analysis of this data can reveal patterns and trends that are beyond the capabilities of traditional databases, spreadsheets and files.
3. If you are looking to use mapping to harness large data volumes more efficiently, learn from the experience of others. Now, with the emergence of GIS tools for big data processing frameworks like Hadoop, analysis and predictive modelling can be carried out on massive data sets to gain unrivalled insight. Governments can use it to design disaster response plans. Health service organisations can model the potential spread of a disease and identify strategies to contain it. The Energy Saving Trust works with the Government, local authorities and commercial organisations to help them improve energy efficiency initiatives and reduce fuel poverty. They combine information from big data sources including open data, demographics and solar potential to identify housing that is suitable for specific energy-saving measures.
4. Adopt a rounded approach to data. Big data is not just about a mass of data; it's about an approach to working with data, the location analytics tools required to work with it, and deriving business value.
The Aberdeen Group's recent report "Location Analytics: Putting the Evolution of BI on the Map" revealed that organisations with data visualisation tools can access timely information 86% of the time, compared to 67% of the time for those without visualisation tools. This means that BI users without data visualisation have to make twice as many decisions based on gut instinct or incomplete, outdated information. By bringing together big data and mapping, organisations can tap into a raft of benefits. They can drive faster time to market by highlighting previously unseen patterns within existing data sets. They can bring together different technologies like BI and CRM with location analytics to deliver enhanced business insight. And the combination of big data and mapping can lead to more accurate decision-making as well as delivering enhanced customer engagement, improved profitability and greater competitive edge. Sharon Grufferty is head of SaaS product management at Esri UK. She is responsible for product and content strategy across new public and private sector markets.
-
http://cdn.mos.techradar.com/art/TRBC/Abstract/Cyber%20lock/iStock_000020317880Small-Henrik5000-470-75.jpg In today's business environment, there are several different challenges when it comes to sharing sensitive information. We spoke to Andrew Holmes, Director, Desktop at Nitro, in order to discuss the state of document security and the rise of Rights Management products. TechRadar Pro: When collaborating and sharing documents, what are some common mistakes people make that can potentially compromise sensitive information? Andrew Holmes: In today's business environment, there are several different challenges regarding sharing sensitive information. To start with, sharing doesn't just happen internally between employees, but also externally with partners, vendors and customers, which presents added risk. It's hard to control what happens to your document once you share it, and to know where it might end up if it gets forwarded intentionally or unintentionally. Without the latest document security solutions and tools such as RMS, anyone can still store documents on USB thumb drives, smartphones or other external devices that are not protected and might be easily accessible to bad actors. Another challenge is that some collaboration tools adopted by IT can actually be too locked down and prevent external sharing, which ends up being counterproductive. Employees will inevitably seek out workarounds in order to stay productive, i.e. downloading a file from the collaboration tool and then emailing it out. Additional control measures that persist with the content – like RMS – can minimise these risks because they enforce who can access the sensitive information, and what actions they can take on a document (e.g. viewing, printing, modifying, etc.). TRP: What poses a greater information protection risk to organisations – outside hackers or insufficient IT safeguards and careless employees? AH: While hackers certainly pose a threat, it can surprisingly be an organisation's own internal employees (oftentimes without ill intention) who introduce the most risk of exposing sensitive information. Many employees still attach documents (primarily PDF files) via email because it is convenient for sharing and they're not aware of more secure solutions. The best defence is a good offence, and that begins with having a knowledgeable, well-trained workforce. In addition to establishing best practices and educating employees, IT needs to provide tools that are easy to use. The challenge is finding the right balance between providing tools that enable security and meeting usability needs. Of course, the proliferation of BYOD (Bring Your Own Device) has added another layer of complexity, and organisations must deal with the implications. One of the worst things IT can do is provide document security tools that are too cumbersome for people to use. It's important to offer something that integrates into employees' daily routines with little to no learning curve. And as cloud and BYOD adoption continues to accelerate, implementing policies and managing technologies will require a more detailed action plan. TRP: Why is RMS seeing growth in a wide range of industries, from manufacturing and aerospace to banking and telecommunications? AH: Sharing and collaborating is an inevitable part of every industry, and all organisations also have confidential data they are concerned with.
For example, in banking and financial services, employees manage organisational balance sheets and income statements containing information such as revenue, account numbers and client contact information. In addition, they may handle sensitive documents related to mergers and acquisitions that could seriously impact revenue or competitive strategies if accessed by unintended parties. And high-tech manufacturing companies often share confidential product schematics, technical diagrams and other intellectual property with venture capitalists. TRP: Are there any limitations to RMS? AH: We are working closely with Microsoft to establish agreed upon standards that would enable cross-product RMS compatibility. However, currently both the document owner and their recipient have to use the same application (i.e. Nitro Pro) in order to view RMS-protected files. TRP: Can you explain the process involved in securing documents using a Rights Management product? AH: In Nitro Pro, this process involves only a few clicks of the mouse. You begin by simply clicking the 'Microsoft Security' button under the 'Protect' tab. From there, you add the recipient's corporate email address (the email needs to have a corporate domain) and then select the level of permissions. These permissions cover everything from editing and printing a document, to allowing comments and text copying. TRP: How do different Rights Management products in the market compare? AH: In principle, different rights management products work similarly to encrypt and manage access to documents, allowing the sender to maintain control of source files at all times. They allow users to manage documents and users from anywhere, authenticate every open or print attempt, and delegate user and document access. Some of these products are format specific (e.g. PDF), and others such as RMS work with all file formats (including Microsoft Office). Nitro Pro is already a natural complement to Microsoft Office for managing documents, and since Microsoft RMS is an integral part of the Office suite across all file formats, the two products offer a powerful solution. Rights management pricing differs across vendors, some of which require a minimum number of users (such as Adobe). Microsoft is much more flexible, especially with SMBs that need smaller or individual subscription pricing options. About Andrew Holmes Andrew joined Nitro in 2011 as QA Manager before rising to become Director of Desktop. Nitro Pro 9 has been his proudest achievement so far – the first desktop application integrated with Nitro Cloud.
-
http://cdn.mos.techradar.com/art/cloud_services/Fuzebox/fuzebox-470-75.JPG The origin of DevOps is broadly attributed to Patrick Debois five or so years ago. DevOps runs the risk of becoming a word without meaning as the industry rallies to attribute it to every product and service in their portfolio. The reality is that I can't sell you DevOps, and you can't buy it. DevOps is more than simply automation. It includes things like culture. It's a way of working that values collaboration with a shared view of success. As the name implies, this collaboration is primarily between developers and operators, but it is not restricted to that. DevOps means that you see the end-to-end delivery system from concept to production; your DevOps scope is any group involved in that workflow. So if you're willing to make a cultural investment in changing how you work, then you can start to implement some of the working practices that DevOps recommends.
Integrating DevOps
But before you jump head first into identifying work and managing constraints, ensure you know why you want to make DevOps part of how you succeed at IT services. Many successful implementations have been driven by a desire to:
• Reduce time to market for new products and features
• Build more agility to adapt to internal and external influences
• Unlock cost savings offered by cloud platforms
• Eliminate the risk of shadow IT services
Successful DevOps adoptions are easy to spot. They are the organisations that talk about making tens and even hundreds of code releases into production every day, where there seems to be a constant flow of new features to keep users engaged and loyal. The leaders here are internet giants like Google, Facebook, Netflix, Etsy and more. But whilst many of these grab headlines, there are large numbers of success stories from organisations that rely on internet and web services as a significant portion of their go-to-market channel; examples are Spotify, thetrainline.com, even Rackspace.
Is your organisation a good fit?
From the eclectic list above it should be apparent that DevOps is not an exclusive club; anyone can pick it up and try to make positive changes in their business. However, there are some common traits that identify the strong-fit candidates for change:
• An existing strong culture of collaboration and open communication
• An executive leadership team who sees IT as a business enabler
• Sponsorship from a high enough level to allow challenges to the status quo
• Revenue and brand derived substantially from web and mobile channels
• Established organisations who view internet start-ups as a threat
• Start-ups who want to be more nimble than their established competition
• Applications that are self-built and developed on open technology
All of these provide compelling business drivers to embrace change as an opportunity to create an advantage in your industry. But the transformation will be hard at times, and leaders who are sponsoring these initiatives need patience and clear expectations to ensure the team is given every opportunity to succeed. As with cloud transformations, DevOps transformations will not be universally successful. In fact you should expect to fail at times. The trick is to fail fast, learn and repeat. Fear of failure is deep-rooted in cultures where incident reviews are a blame session rather than a learning exercise; removing that fear is why you must commit to the culture as well as the working practices.
However, not addressing DevOps carries risks of its own. If your competition gets faster to market with features and products, what does this mean for your business? If your competition can open up financial advantages by increasing operational efficiency without impacting margins, how will you react? DevOps adoption should not be driven by fear, but not making it part of your IT plans should be a well-managed risk in your organisation's strategy.
Where to start
So hopefully you've reached the point where you're interested in getting started. Here are some prompting questions to get you going:
• Which application am I going to build this model around? Don't do this wholesale across your IT estate. The application and the teams around it are key factors in success; identify a candidate application and create a bubble of autonomy around it.
• Can I draw out the entire process from idea to production release? Simply drawing out your processes is a great way to identify stakeholders and eliminate waste. Never assume you have all the right people in the room in the first meeting!
• Where are my constraints? The key to increasing flow through a system is to manage constraints. Creating capacity on either side without addressing the bottleneck will not increase output. Work out which parts of your system slow you down and improve them.
• Do I have the skills to execute this? DevOps skill sets are in high demand right now. Take time to understand whether you have the right resources to succeed. If you don't, make plans for additional hiring or training, or look to introduce a third party who can help offload some of the core functions.
There is a huge amount of reference material online. DevOps as an approach values sharing; many organisations have published materials about their successes and failures for you to learn from. Chris Jackson is the Chief Technologist for Rackspace EMEA.
-
The Reboot Realm is a forum for anyone with an appreciation for past, present and future technology, whether that be gaming, PCs or mobile phones. To find out more, join our community today at http://www.rebootrealm.com We hope to hear from you soon! Notice: This ad has been placed under the assumption that it is within forum guidelines; if this is not the case, please contact john[at]rebootrealm.com
-