27 February 2014 by James Hayes
As if competing among themselves weren't taxing enough, the telcos have already seen millions - perhaps billions - in revenue switch away from old school text messaging toward low-cost or free messaging offered by arriviste Web-based rivals. But the launch of voice competitors based on a low-cost price model would not only represent a major challenge to the traditional telcos, it could also have a debilitating effect on global economies as they struggle to restabilise after years of recession. If telcos and their revenues are hit badly it will also knock the prospects of the many investors that have major holdings tied up in them.
Last week's public forecasts from industry analyst Ovum indicate that telecoms operators will lose $386bn by 2018 from customers using 'over-the-top' (OTT) VoIP solutions like Skype and Microsoft Lync. Its OTT VoIP Outlook: 2013-2018 report predicts that traffic in the global consumer OTT VoIP market can be expected to grow at a CAGR of 20 per cent between 2012 and 2018.
More recently some of these losses have been offset by increases in their broadband subscriber bases - but the increasing availability of free Internet access is having an effect on even that. And remember that it's not just revenue slowdowns that the telcos have to contend with in terms of juggling their finances. Operating licences and spectrum auctions also put a drain on fund reserves; and OTT/VoIP services providers aren't based on traditional, much-regulated mobile networks, so don't have to shell-out for operating licences and the like.
The independent telcos have been a fairly good bet for investors for nigh-on 30 years - and have appealed to both small shareholders and large institutional fund managers - because, until recently, it looked like they had an assured revenue model going forward. Their markets were growing - increasing numbers of subscribers were spending more time making mobile voice calls and signing-up for broadband services. That model is now under strain. Some of the telcos - such as BT with its BT Sport diversification - seem to realise this; others seem slower to adapt.
Economic pundits might argue that the investment should follow the action, and switch from the telcos to their arriviste rivals; but moving investments from solid-looking, well-established and regulated telcos that maintain broad portfolios comprised of international operations, to a five-year-old provider of a proprietary, cross-platform instant messaging subscription service for the consumer smartphone market, is not a like-for-like proposition - even with Facebook fondling the purse strings.
Consumers may already be gleefully anticipating the savings that the advent of cheap/free calls will bring them, but the fact remains that the damage this might cause to the traditional telecoms industry could have a significant disruptive effect on national economies. The unpredictable factor in this scenario is the enterprise communications market, and whether MLEs - medium-to-large enterprises - will be willing to place their mission-critical communications requirements in the care of services providers primarily oriented toward the wayward whims of the consumer market.
Edited: 28 February 2014 at 12:29 PM by James Hayes
Shared IT model does not share risks
23 September 2013 by James Hayes
banks and other financial institutions are not only duplicating their own infrastructures when it would be more cost-effective to share resources, but also that operating over fully-shared platforms would make collaboration (such as account moves) easier, and also help to detect fraudsters and other nefarious agents who may be moving money between accounts for criminal purposes, dodgily applying for multiple loans, or engaged in money laundering. The concept is sometimes dubbed 'bundled infrastructure'.
Friday's FT report reckons that bank supremos are concerned about the fact that IT expenditure has been rising at a time when banks face greater deficits on their reserves as a result of regulatory changes and falling revenues. The traditional model of IT deployment has changed in recent years, and banks no longer need to pursue a 'build it, own it' approach to provisioning their computer system requirements. It quotes 'a top executive at a large European bank' as saying that banks should stop allowing their IT departments to "build empires".
The executive adds: "Why do we need to spend billions on data centres and IT infrastructure? Most of us make things that are extremely duplicative."
When you get 'top executives' opining about IT strategy it's wise to wonder if they have taken the trouble to enquire if there may be reasons, indeed benefits, as to why respective banks have preferred to go it alone and retain largely enclosed, proprietary front- and back-office computer systems in an age when other verticals are (supposedly) embracing the joys of cloud computing and outsourcing.
One area where it's been mooted that third-parties could be more involved is in data analysis. But allowing an external partner access to primary banking data entails a high degree of risk. It throws up major issues around ensuring data protection and data integrity. No system is perfect, but enforcing governance on internal processes and personnel is bound to be as effective as it gets; gaining the same guarantees from a third-party, for whom ultimately the level of risk is profoundly lower, is not.
Then there is the question of information security: collapsing disparate IT systems onto shared platforms would effectively remove the physical barriers that exist while large systems belonging to different parties are kept separate. Of course, because of the massive interconnectedness that already exists within the financial sector, it is in theory possible for a hacker to exploit those links.
But the difference is that to do so they would have to come up against additional layers of physical security in the shape of firewalls, router security, and intercommunications checks - all of which would be obviated in a fully-converged banking IT scenario.
IT departments across all commercial sectors have been under pressure to make budgetary savings in terms of capital expenditure and operational costs for many years, and the best IT practitioners do not like to work with technology that is not delivering optimal performance to the business - whether that's because it is over- or under-funded.
Edited: 23 September 2013 at 10:27 AM by James Hayes
GF1 crowd-funding is go!
9 September 2013 by James Hayes
an opportunity to have their own name given to a character in what's turned out to be the great man's last project before he died last December of Alzheimer's disease.
Titled Gemini Force One (GF1), the project is planned as the start of a series of books aimed at younger readers, based on concepts and stories that were originally written by Anderson starting some five years ago. They tell the story of a secret organisation involved in rescues and the aversion of disasters, including terrorist attacks.
To get the project completed Anderson's son Jamie has commissioned author MG Harris to complete the first book in accordance with his father's instructions. The eventual aim for GF1 is for the books to be adapted into a new film or TV series.
The campaign, which opened yesterday, aims to raise £24,350 over 30 days - a fairly modest target for a Gerry Anderson project - Anderson once reckoned that each episode of Thunderbirds (ATV, 1965-1966) cost the equivalent of £1m to make (by 2005 values). There are some tasty rewards on offer for backers:
- A donor willing to stump-up £5,000 can have their name used as one of the main characters in a GF1 book.
- A set of five limited edition collectors' postcards; four of which feature behind-the-scenes production photographs and screen tests - all from Gerry Anderson's personal archive.
- A limited number of reproductions of Gerry Anderson's business cards from AP Films, and its short-lived predecessor Pentagon Films, are also on offer.
Other backers will receive rewards such as signed editions, visits to Pinewood Studios (used as a location for Anderson creations such as UFO and Space: 1999), and personal dedications in the books.
Anyone wishing to be Fully Advised and Briefed on how to make a donation should browse to:
Edited: 09 September 2013 at 04:46 PM by James Hayes
Overcoming orbital hazards...
22 February 2013 by James Hayes
Specialised radiation-hardened components can allow conventional satellites to operate for years without problems; but they are expensive, which adds significantly to the overall cost of development. A further expense comes from the need to ensure that circuits have back-ups if one is unfortunate enough to be damaged by radiation.
Many of the circuits that function happily at ground level are much more likely to fail, sometimes irreversibly, when flung into orbit and exposed to the harmful phenomena there to be found. Dealing with these damaging emanations has posed major challenges for satellite engineers.
One approach to ameliorating the effects of radiation is to use triple modular redundancy (TMR). Three sets of electronic circuits are used for each function, and vote on the output to weed out errors caused by stray alpha particles that may flip a control or memory bit.
This is expensive to do across the board; but as Chris Edwards reports in the current issue of E&T, Cubesats such as UKube-1 - developed, built and tested for the UK Space Agency by Clyde Space of Glasgow - employ limited amounts of TMR to ensure critical subsystems, such as the Mission Interface Computer (MIC), can function after a failure.
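For readers unfamiliar with how TMR voting works in practice, here is a minimal sketch - my own illustration of the general technique, not code from the UKube-1 project: three redundant copies of a value are combined with a bitwise majority vote, so a single-event upset in any one copy is outvoted by the other two.

```python
def tmr_vote(a, b, c):
    """Bitwise majority vote over three redundant copies: each output bit
    takes the value held by at least two of the three inputs."""
    return (a & b) | (b & c) | (a & c)

# A stray particle flips one bit in one replica; the voter restores the value.
clean = 0b1011
upset = clean ^ 0b0100  # single-event upset in the second replica
print(tmr_vote(clean, upset, clean))  # prints 11, i.e. 0b1011
```

In real flight hardware the voting is of course done in logic alongside the replicated circuits, per function - which is precisely why applying it across the board is so expensive.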
Radiation may also have its practical uses up there. An EADS Astrium experiment to fly aboard UKube-1 will use the properties of radiation strikes to create random numbers for cryptographic applications. It is difficult to generate random numbers with sufficiently high entropy to defeat codebreakers. Spaceborne random number generators may provide an answer.
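One classic way to turn physically random but possibly biased events - such as the timing of radiation strikes - into usable random bits is von Neumann debiasing. The sketch below illustrates that generic idea, and is not a description of the EADS Astrium experiment itself: bits are taken in pairs, '01' emits 0, '10' emits 1, and '00'/'11' pairs are discarded, removing any fixed bias at the cost of throughput.

```python
def von_neumann_extract(bits):
    """Debias a stream of independent but possibly biased bits.
    The pairs 01 -> 0 and 10 -> 1 are equally likely whatever the bias;
    00 and 11 pairs carry no usable information and are dropped."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

print(von_neumann_extract([0, 1, 1, 0, 0, 0, 1, 1, 1, 0]))  # prints [0, 1, 1]
```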
Read more on this topic - 'The future of the satellite is cubed: meet the Jack-in-the-box that can carry your experiment into orbit on a realistic budget'
Big Data's 'bypass' dilemma
21 November 2012 by James Hayes
factory sensors, trading systems, and call detail records compiled by telecoms companies; a big multinational enterprise soon faces its own 'Big Data Andes' range to conquer.
One often overlooked area of Big Data management that developers are under increasing pressure to minimise is data trafficking. Because the processing of large data sets is often performed in a different location to where the information itself resides, it first has to be transmitted over a network connection. Bandwidth fluctuation can affect both application availability and performance - especially true in environments where privacy and data sovereignty concerns demand that sensitive data does not leave on-premise servers, and where many users at multiple sites collaborate to submit, process, or analyse big data sets or reports.
And where information cannot be readily fed into the various different Big Data system elements because the sheer volume of information involved swamps one or more of them, further bottlenecks on the ingress or processing side can bring the flow to a halt. It's a scenario that will be familiar to any old school network manager from the 1990s, a decade when data traffic volumes first started to outstrip infrastructural capacity in open-build premises LAN backbones.
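A quick back-of-envelope calculation - my own illustration, not a figure from any vendor - shows why these data 'shunts' bite: shipping even a modest Big Data set across a typical WAN link ties the link up for hours.

```python
def transfer_hours(dataset_gb, link_mbps, utilisation=0.6):
    """Estimate the hours needed to move a data set over a network link,
    assuming only a fraction of nominal bandwidth is sustained in practice."""
    bits_to_move = dataset_gb * 8e9           # gigabytes -> bits
    usable_bps = link_mbps * 1e6 * utilisation
    return bits_to_move / usable_bps / 3600

# 500 GB over a 100 Mbit/s link at 60 per cent sustained utilisation:
print(round(transfer_hours(500, 100), 1))  # prints 18.5 (hours)
```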
Back in those days it was a matter of introducing bigger, more capacious intelligent hubs and using network management tools to better segregate traffic flows before evaluating the necessity to build-out more cabling infrastructure, preferably fibre-based.
Nowadays most organisations will probably not want to spend more ICT budget on installing supplementary premises cabling just for the periodical Big Data 'shunts' - building a sort of 'Big Data bypass', as it were - especially at a time when strategists are claiming that the compelling appeal of IP-based networking is that you can constrain costs by running multiple traffic stream types over the same hardwired infrastructure.
But this additional demand on physical infrastructure is another factor in the Big Data challenge, and one that has particular poignancy for organisations that, for whatever reason, are not allowed to outsource their data centre operations.
Meanwhile, anyone with an interest in Big Data should attend the 'Big Data - Turning Big Challenges into Big Opportunities' Seminar, taking place on Wednesday 5 December at the Lion Court Conference Centre (just off Holborn) in central London (WC1V 6NY). This case-study driven seminar day - chaired by yours truly - will equip delegates with insider know-how as to how to implement big data strategies, and furnish you with contacts from across the big data industry to support your work going forward. Full details at
In short: the device divide
18 September 2012 by James Hayes
Jonathan Hunt, business development director at desktop and virtualisation firm Point to Point, about 'bring your own device' (BYOD), an ideology that could be transitioning from wishful thinking to market orthodoxy.
Generally-speaking, BYOD is a business policy of allowing/encouraging employees to bring personally-owned mobile devices such as laptops and smartphones to their place of work, and to use them to access company resources including email, intranet, and company applications, alongside their personal data and applications. That the BYOD ethos can also have a 'disruptive' impact on enterprise digital communications should not be overlooked.
Yet BYOD might also uncover pointers toward changes in the way people need to use information technology in mobile-connected working environments. The lowering price-tag of sophisticated mobile devices, plus the growing ubiquity of free, reliable public Wi-Fi connectivity, are also highly influential factors in encouraging the phenomenon. People no longer have to rely on their employer to meet the cost of a baseline-specification system that costs thousands. Corporate equipment refresh cycle lags sometimes meant that office workers had more sophisticated PCs at home than they did at work; but even so, some predict that BYOD will cause a bunch of operational issues that many boardrooms will find discomforting.
Such concerns will have to be allayed if pro-BYOD sloganising about capital expenditure reductions and staff training budget savings, as well as how staff are more likely to take care of devices containing sensitive data if the device is a personal possession rather than belonging to an employer, is to be taken seriously.
More productive staff might mean higher salary bills, and budget saved on buying new PCs over standard product refresh cycles is likely to be offset by a requirement to invest in new BYOD management tools - a growing market sector.
Although BYOD is partly about decentralisation and the freeing-up of choice, it still has to be managed under a policy-based regime. This presents fresh challenges for ICT policy makers. BYOD advocates could yet find that newfound flexibility comes with quite stringent obligations and even a renewed requirement to stick to agreed guidelines, such as acceptable usage policies - employers will be concerned about corporate data sharing a hard disk with potentially inappropriate personal content, say. There is also the possibility that career penalties will apply even when individuals are careless with hardware that belongs to them, and they are liable for replacements should it get lost during work hours.
But the BYOD ethos is pervasive enough to cause IT strategists to consider if it might usefully inform new thinking about future developments in enterprise IT planning. For instance, if fewer enterprise staff are working from hardwired computers, is 'traditional' local-area network infrastructure still needed as much? One scenario along these lines is in smart buildings, where premises technology managers might deem more of the available Ethernet communications capacity better used by other applications, as there is less fixed-device information systems data running over it, and more of what remains can be transferred onto internal Wi-Fi connections, where procedures allow.
BYOD could also prompt changes in conventional enterprise voice communications. Email and text messaging are increasingly the primary media for business communications, making the need to pick up a call first time less pressing. Voice messaging is no longer regarded as the efficient business tool it once was.
A related factor at play here is the fact that VPNs allied to cloud-based software as a service (SaaS) and data as a service (DaaS) options - possibly bundled as virtual private clouds (VPCs) - should mean that little, if any, sensitive employer data will actually be stored on any portable devices. It is also important to bear in mind that the BYOD phenomenon can be seen to some extent as part of a general trend toward 'thin client' style endpoint devices - both fixed and mobile - where very little enterprise data is actually held on the device itself.
Another factor on the security side is the fact that BYOD-minded organisations are realising the necessity of new approaches to overall risk-assessment that include classifying their data and applying a value to it, so that each user's data-access privileges are better aligned. This seems another factor in support of BYOD, but one that could entail cost implications in a business environment where enterprises are generating data at such rates that the monitoring and classification becomes a full-time task requiring additional IS staff, tools, and the support of third-party services.
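As a toy illustration of that classification idea - the labels and thresholds here are hypothetical, not any particular product's scheme - data gets a sensitivity label, users get a clearance, and access is granted only when the clearance meets or exceeds the label:

```python
# Hypothetical classification levels -- an illustration of the principle only.
LEVELS = {"public": 0, "internal": 1, "confidential": 2, "restricted": 3}

def may_access(user_clearance, data_label):
    """Grant access only when the user's clearance meets or exceeds
    the sensitivity classification applied to the data."""
    return LEVELS[user_clearance] >= LEVELS[data_label]

print(may_access("internal", "confidential"))  # prints False
print(may_access("restricted", "internal"))    # prints True
```

Even a scheme this simple presumes that every data set has been labelled in the first place - which is exactly the monitoring-and-classification overhead described above.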
An extended version of this 'The Device Divide' post appears in the October 2012 issue of 'E&T' magazine.
Edited: 24 September 2012 at 12:45 PM by James Hayes
Turing's legacy: the celebratory videos
30 May 2012 by James Hayes
Organised by the National Museum of Computing at Bletchley Park, 'Turing and his Times' featured a talk by computer historian Professor Simon Lavington on 'Turing and his Contemporaries', a simulation of the Pilot ACE computer by TNMOC trustee Kevin Murrell, and the first formal public showing of a video commissioned by the National Physical Laboratory (NPL) of the recollections of two of Alan Turing's colleagues.
The three-part video consists of:
- Part 1: Introduction and NPL video featuring two of Turing's colleagues.
- Part 2: Prof Simon Lavington talks about Turing's ideas post-1945.
- Part 3: Kevin Murrell demonstrates a simulation of the Pilot ACE, followed by a Q&A session (includes an unexpected appearance from an audience member who had programmed the Pilot ACE computer!).
'Turing and his Times' was the second of three Turing-themed events linking three of the top computing museums in the world. At The Computer History Museum in California on 7 March, historian George Dyson (author of 'Turing's Cathedral') was in conversation with Museum President and CEO John Hollar about the influence of Alan Turing on John von Neumann (and vice-versa) as the digital universe was taking its present form. A video of that encounter is also available.
And in Germany, on 26 May, the Heinz Nixdorf Museum in Paderborn hosted an event featuring two short lectures: Professor Dr Horst Zuse talking about his father Konrad Zuse and his computers, and Professor Dr Paul Rojas comparing Turing and Zuse, plus videos of their Turing exhibition and the Heinz Nixdorf Museum's working mechanical Turing machine.
And there's information about the forthcoming Alan Turing Centenary Conference - and other Turing content - in this recent E&T news update - 'Turing conference promises unique meeting of minds'.
Edited: 20 August 2012 at 03:43 PM by James Hayes
Strowger redux: Exchange and smart
17 May 2012 by James Hayes
beginning tomorrow, Friday 18 May) will know that even as relatively straightforward a mechanism as the Strowger system - prototyped using pins and collar boxes, legend has it - soon developed into a highly sophisticated piece of electromechanical apparatus.
Interested students of telecommunications history should check-out this 8.5 minute 1951 instructional film, made available online courtesy of the AT&T Tech Channel - 'The Step-By-Step Switch'.
As the accompanying web text explains, the purpose of this film was to show Bell employees how calls were automatically switched through an SxS office, and gives a general appreciation of the importance, complexity, and cost of switching equipment in an average 1950s telephone office. The path of a call is shown as it runs through a demonstration unit. "Careful adherence to Bell System maintenance practices" is stressed.
Avoncroft Museum celebrates Strowger automatic exchange centenary
14 May 2012 by James Hayes
make the correct connections between caller 'end device' and receiver 'end devices'.
American Strowger first conceived his invention in 1888, and patented the automatic telephone exchange three years later. Some reports suggest that he constructed an initial model of his invention from a round collar box and some straight pins.
The centenary will be marked in a three-day celebration beginning on Friday 18 May at the Avoncroft Museum, home of the National Telephone Kiosk Collection. Avoncroft Museum (near Bromsgrove, Worcestershire) is a 15-acre open-air site of historic buildings. The Kiosk Collection opened in 1994, with the support of BT's Connected Earth heritage initiative, and contains examples of all BT kiosks down the decades. The Museum is open from 10.30am to 5.00pm during the Strowger Centenary Event (18-20 May 2012). More details at http://www.avoncroft.org.uk.
Mr Strowger, BTW, was not a technologist by trade, but a funeral director from Kansas City. Legend has it that Strowger's undertaking business was losing custom to a rival whose telephone-operator wife was intercepting callers for Strowger and redirecting them to her hubby's parlour.
Alas Strowger himself didn't live long enough to see his brainchild flourish: he died in 1902, a decade before the opening of the Epsom exchange. He was survived by his widow Susan: after her death in 1921, an obituary claimed that she had been sitting on additional 'revolutionary' Strowger designs, but 'had refused to make them public while she was alive because only others would profit from her husband's designs'.
Edited: 14 May 2012 at 09:32 AM by James Hayes
Data de-duplication makes a poor fist of mass-media entertainment
27 April 2012 by James Hayes
Host Mitzi Meyers' subject this time was Steffi, a 36-year-old biochemist and part-time webmeistress from a Berlin suburb who obsessively retains all her PC log files: she has millions of such sets going back to her time at university, including all of those from various websites she has administered over the last 15 years. Unlike her fellow data hoarder Günther featured in Tuesday's show (see Buzzsore's 25 April 2012 post), Steffi has the wherewithal to migrate her data onto successively more up-to-date storage media, and has reached the stage where she keeps RAID devices under the floorboards of her small terraced haus.
Also unlike Günther, Steffi remained for most of the programme in staunch denial that she had any kind of problem - "It is not so much that I want to keep all my data, it is just that I do not want to delete any of it," she explained tearfully into the camera. To help Steffi come to terms with her data-doting psychosis Mitzi took her to a data de-duplication consultant in Stuttgart who helped take the first painful steps on the road to recovery: excising all the third and fourth copies of files alone slimmed-down her data sets by several gigabytes, and alles war gut for Steffi; but not for your blogger, who channel-hopped to a Spanish TV documentary about a tribe of Intellectual Pygmies who, a bearded anthropologist explained, worship Melvyn Bragg's BBC Radio 4 programme 'In Our Time', which they listen to every week on an ancient crystal set that was salvaged from a crashed biplane sometime in the 1940s.
Edited: 27 April 2012 at 06:29 PM by James Hayes
Service robots are go
26 April 2012 by James Hayes
trundled around the smooth-floor circuit bordering the company's stand.
This year Bluebotics has been joined by more service robot makers, and there is clear evidence that this technology is moving from clever novelty to commercially-viable application.
Take for instance the showcased products from German firm MetraLabs, its SCITOS G6 Transporter and its SCITOS G3 home-care model. The former resembles an item of cafeteria furniture most people are familiar with: the open-stack cabinet for leaving your tray and used tableware, etc., in. G6 is a robotised version of this eatery staple, programmed to autonomously take the dirties back to the kitchen when all its slots become filled, then return empty to its designated post.
MetraLabs has stuck a bug-eyed little 'head' on the top of the unit, and at first I thought, "Oh, not another daft attempt to make a robot look like a toy human"; but it actually has a practical purpose: it prevents people from overloading the unit by placing items on its top, and also provides a branding opportunity for restaurant chains or their suppliers.
SCITOS G3 is designed to support persons in home environments, nursing homes, and even hospitals. It trundles about displaying an interactive touch screen that can be used for, say, displaying medication schedules. Future possibilities include integrating this with mobile phone-based telehealthcare apps, or even full M2M functionality.
One other point about the mobile service robots at Hannover Messe worth noting: unlike last year, when they were kept corralled within the confines of each exhibitor's stand, this week they have been allowed to wander - autonomously in some cases - around the aisles, so that visitors can inspect them up close, touch and prod them, put their arms round them and have their photograph taken, etc.
This malarkey will in due course present something of an issue for exhibition organisers, one suspects, when their clients want to use mobile robots to distribute marketing collateral (flyers, freebies) to visitor throngs. And how long before the dreaded health & safety issue crops up, with the first legal action for an injury caused by somebody colliding with an autonomous service robot?
You can't help wondering how many of the past's key breakthrough innovations would have been stymied by H&S if it had been around at the time. Just think of Stephenson's Rocket, for instance ("Sorry Mr S., we can't have the crew setting off on an open-top footplate without protective headgear and safety harnesses"); mind you, a bit of 19th century H&S might have saved William Huskisson MP from being knocked down and killed by the steam locomotive at the opening of the Liverpool & Manchester Railway in September 1830, thus becoming the first victim of a train accident.
Edited: 26 April 2012 at 01:41 PM by James Hayes
Oddball data hoarders get own reality telly outing
25 April 2012 by James Hayes
'ordinary' (but bonkers) people who cannot face-up to discarding redundant computer data. During the course of each episode they are coaxed into junking stuff that they have been hanging-on to for years, sometimes decades, because they have a pathological inability to get rid of it.
Hosted by genial self-styled self-help guru & lifestyle enabler Mitzi Meyers, the show at first confronts its subjects with the extent of their obsessive retentiveness, with lots of hand-held camera tours of the extent of the problem, dramatic zoom-ins and soundtrack chord changes.
Yesterday evening's Über-Daten-Hamsterers! featured 48-year-old Günther, a freelance governance officer from Frankfurt, who has kept copies (and copies of copies, and copies of copies of copies - you get the point) of every file he has ever created since he first started using a PC in the late 1980s. The spare room of Günther's family flat was stacked with storage media - 5.25 inch disks and 3.5 inch diskettes, tape cartridges and external hard disc drives - I even spotted an old Syquest unit among the chaos.
Mitzi pulled out some discs at random and made Günther view their contents: acknowledgement letters, old invoices, drafts of letters to travel agencies regarding forgotten package holidays, and thousands of photo files of family pets long since deceased and disposed-of. There were gigabytes of downloaded PDFs containing information that Günther still believes "may come in handy some day".
Pressed by stern but understanding Mitzi into accepting that he has a problem, and that this problem is having repercussions on the psychological well-being of his family (the youngest child has to sleep in the garage because Günther refuses to remove his hoard of ancient, rotting data from the spare bedroom), he broke down in tears and agreed to tackle his problem. Mitzi gave him a big hug, and told him soothingly that "es ist OKAY"...
After the commercial break a crew of burly blokes wearing protective clothing arrived in a rather cool-looking Mercedes truck, and spent the morning chucking the bags and boxes of old disks etc. into waste disposal crates. The disgusting archive was covered in dust and grime, and several cockroaches came scurrying out as stacks of lockable plastic storage cases of disks were disinterred.
Poor old Günther was led away sobbing to stay with some IT clinicians while the spare room was gutted and redecorated by another burly crew of 'fresh-start décor counsellors'. The show ended with Günther being led back into the room blindfolded to find a surprise party held in his honour. Everyone was having a lovely time until Günther's wife discovered a secret hoard of novelty USB memory sticks that he'd been hiding at the back of the cutlery drawer.
Ja, you can bet that I'll be one of the many millions of German viewers who'll be tuning into the next heart-rending episode of Deutschland's Schlechteste Super Daten-Hamsterers!
Edited: 26 April 2012 at 01:43 PM by James Hayes
Cisco gets tough for life on the factory floor
- a sector not normally associated with its mainstream business computing heartland; but Cisco has big plans not only for the smart technology that's coming into factories and assembly lines around the world, but also for the next generation of technologists who'll be tasked with making it all work - and in manufacturing that means manufacturing engineers, rather than conventional ICT techies with CCNP rankings.
In respect to the first, Cisco 'reinforced its commitment to the industrialisation of the Internet' with the launch at Hannover of the Cisco IE 2000 industrial switch series. This class of devices is designed specifically for the build-out of intelligent networks for industrial automation that link the plant floor to enterprise networks.
This glib phrase belies the fact that managing datacommunications within manufacturing plants or assembly lines is not only hugely different to running business applications in offices, but it also requires competent personnel who can work the kit, and integrate it with strong-arm robotics and big metal that stamps and moulds and hisses. To this end the company has revealed that it is "looking at the possibility of Cisco training and certification for manufacturing professionals", according to Maciej Kranz, VP/GM of Cisco's newly-established Connected Industries Business Unit, speaking exclusively to E&T.
Cisco regards manufacturing - smart manufacturing, more specifically - as a major new potential market for its solutions, and that potential seems so evident that it is surprising that some of its erstwhile competitors haven't grasped the possible benefits of strutting their stuff in front of procurers visiting the world's biggest industrial fair.
An important driver of change here is sensors, Kranz points out: "The growth of sensor-based data is just unprecedented," he says. "They are now being installed in many industrial environments where they haven't been before, or certainly not in such a sophisticated way. Oil rigs, for instance, fitted with sensors for various applications, generate terabytes of data. And as that data builds up, organisations need to traffic it and analyse it. In some ways this is not so different from the datacommunications that Cisco has been doing for years - the same issues, like security and resilience, still apply, of course - except that the units have to be much more robust." Kranz's unit's engineers have been busy finding ways to make Cisco kit ready (think ruggedised routers) for these tough environments.
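To make the trafficking problem Kranz describes a little more concrete - this is a minimal, hypothetical sketch with invented sensor values, not Cisco's own tooling - an edge node might roll raw readings up into compact summaries before shipping anything over the network:

```python
from statistics import mean

def summarise_readings(readings, window=3):
    """Roll raw sensor samples up into per-window summaries,
    so only compact aggregates travel over the network."""
    summaries = []
    for start in range(0, len(readings), window):
        chunk = readings[start:start + window]
        summaries.append({
            "min": min(chunk),
            "max": max(chunk),
            "mean": mean(chunk),
        })
    return summaries

# A hypothetical stream of pressure samples from one rig sensor
samples = [101.2, 101.4, 101.3, 99.8, 100.1, 100.0]
print(summarise_readings(samples))
```

The design choice is the familiar one from enterprise datacomms: aggregate close to the source, and send only what the upstream analysis actually needs.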
"Barriers between the IT and OT [operational technology] worlds are just breaking down," Kranz avers, as the latter increasing calls for "levels of sophistication that have up till now been the preserve of enterprise IT".
The time has only just become right for Cisco's move. Up until quite recently, industrial computing platforms had to have very high levels of reliability and robustness because they were so often operating alongside safety-critical systems - a bar that off-the-shelf enterprise networking kit was not built to clear. Another point not to be overlooked is Cisco's reputation for the build-quality of its existing product lines, and the fact that, as a company that has been closely involved in the manufacture of product hardware for over 20 years, it's had plenty of experience of life on the factory floor.
Edited: 25 April 2012 at 02:37 PM by James Hayes
Face-to-face with China's greening?
23 April 2012 by James Hayes
which are leveraging the attention around the event to publicise their respective agendas; but the extent to which Chinese showings at the event could also serve as counterpoints to criticisms of the country's ecological reputation should not be overlooked.
The presence of no less a personage than Chinese premier Wen Jiabao, who opened the event yesterday (Sunday) with German chancellor Angela Merkel, was a high-profile coup for Hannover Messe, to be sure, but one that also brought added impetus to his government's critics. This blogger wasn't present, but Yahoo News reports that a crowd of some 200 demonstrated outside during the ceremony.
At Kröpcke, the Stadtbahn station closest to the hotel where Wen Jiabao met with various civic dignitaries, Amnesty International has taken out a series of large advertisements on the Messe-bound platforms, highlighting 'Human rights made in China'. And as delegates and exhibitors in their thousands step off the trains at Messe-Nord station, they are greeted by at least two bannered protest groups camped outside the entrance distributing flyers.
At the same time it's worth noting that while the partnership with Hannover Messe 2012 is principally about generating commercial/cultural opportunities, it is also couched in the context of change: China's Ministry of Industry and Information Technology, for instance, has announced plans to increase R&D investment in 'enabling technologies for eco-friendly vehicles', geared toward mass-market production of electric vehicles.
Of course, anyone cognisant of declarations of environmental protection will be aware that promises and deliverables in this thorny area are prone to disparities the world over. But it's also reasonable to recognise that Chinese exhibitors are not in Hannover without bona fide 'green' technology on their stands (no pejoratives meant by those inverted commas, BTW - no technology is ever 100 per cent environmentally clean).
Even legitimate criticisms of China's environmental record should not discount the possibility that the sustainable technologies being showcased at Hannover Messe are aimed at finding solutions to those problems - both in the People's Republic and anywhere else in the world where sustainability abuse is evident. High-attendance, open events like Hannover Messe give visitors an opportunity to examine the technological claims at close quarters.
Looking to the future of man-machine interfaces
6 March 2012 by James Hayes
In the enterprise space eye tracking interfaces will change the nature of how we work, but may also change the ways in which we contrive to avoid it - for short 'allowable' periods, that is. A consideration of the history of our relationship with typewriting is instructive here. In the old days professional typists rarely hammered away at their keyboards for recreational purposes, or because they were drafting a message to their social network confreres. Back then, productivity was pretty easy to gauge: it started and stopped when typing was heard and not heard.
PCs changed all that: now who can actually tell why you're tapping away at that computer? Is it really work, or is it 'non-work-related'? Are you applying yourself diligently to the 'agreed' employment tasks you've been set - or catching up on the latest personal social media tools? Even when an inquisitive manager looms Blakey*-like behind you, you just toggle the work spreadsheet to full-screen to effect a decoying manoeuvre.
Such subterfuge at least gives the impression of busyness, and most bosses are happy to overlook cyber-skivers so long as the work of the day gets done (it's not fair: if an employee decided to simply sit gazing into space in lieu of cyber-skiving they'd be down to HR on a fizzer pronto).
Enter eye tracking: using image sensors and image processing to convey commands and instructions to the computer via movements of the operator's eyes. Technology being showcased at CeBIT this week by one of the market leaders, Swedish firm Tobii, aims to take eye tracking beyond niche use and put it to broader use, including finding ways to converge it with conventional applications such as process control and even surgery.
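To see how eye movements become commands - this is a generic dwell-time sketch under assumed coordinates and timings, not Tobii's actual API - a gaze interface typically fires an action only once the eyes have rested on a target region for long enough:

```python
def dwell_select(gaze_samples, region, dwell_time, sample_interval):
    """Return True once the gaze has stayed inside `region`
    (x0, y0, x1, y1) for at least `dwell_time` seconds.

    gaze_samples: sequence of (x, y) points from the image
    sensor, taken every `sample_interval` seconds.
    """
    needed = round(dwell_time / sample_interval)
    x0, y0, x1, y1 = region
    run = 0
    for x, y in gaze_samples:
        if x0 <= x <= x1 and y0 <= y <= y1:
            run += 1
            if run >= needed:
                return True  # target held long enough: trigger
        else:
            run = 0  # gaze wandered off: reset the dwell timer
    return False

# Hypothetical on-screen button occupying (100, 100)-(200, 150),
# sampled at 10Hz with a half-second dwell threshold
samples = [(150, 120)] * 10 + [(400, 300)]
print(dwell_select(samples, (100, 100, 200, 150), 0.5, 0.1))
```

The dwell threshold is the interesting trade-off: too short and every stray glance becomes a click (the so-called 'Midas touch' problem); too long and the interface feels sluggish.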
It is in the area of standard office applications that eye tracking perhaps faces its biggest challenges, especially when it comes to monitoring individuals' productivity. When gaze interfaces come in, managers will know even less about what their staff are really engaged in: responding to a sales inquiry, or watching YouTube?
And what will be the implications for the hard-won human skill of multi-tasking? I'm drafting this blog entry in the CeBIT press centre while concurrently engaged in glancing at emails, search engine results, and my netbook's remaining power, while also acknowledging a passing hack who I haven't seen in donkey's years, and glaring at the fatuous blogger who is loudly recording his podcast at the desk next to me instead of doing it in an empty room (one that's about 10km away, ideally). Will, come the era of eye tracking, such visual flittery be recalled with the same nostalgia as the typewriter generation now looks back on the mixed aroma of new ribbons, carbon copies, and Tipp-Ex paper?
*That's Blakey the lurking Inspector of TV's On the Buses, BTW, not the Jazz drummer.
Edited: 07 March 2012 at 08:49 AM by James Hayes
40-year-old satellite brought back to 'life'
6 September 2011 by James Hayes
BBC News reports that a group of scientists and engineers is working to revive a UK satellite that's been in orbit since 1971.
Prospero was the first UK satellite to be sent up on a UK launch vehicle - a Black Arrow rocket; it would also be the last.
Having now rediscovered the codes needed to contact the satellite, engineers say that they still have to build equipment to 'talk' to it, and then must win approval from the communications regulator Ofcom to use Prospero's radio frequencies - these days used by other satellite operators.
Applying the splinternet
22 July 2011 by James Hayes
cloud computing, which Ananthaswamy (perversely) refers to as "one of the biggest Internet innovations of the past few years" - when, as anyone involved with public-facing ICT likes to remind the laity, it is in fact a long-established model that's been adroitly redressed by commercial imperatives.
Ananthaswamy further adds that "Some companies have moved their entire IT departments into the cloud... [But] the cloud could generate exactly the single points of failure that the internet's robust architecture was supposed to prevent": this again is somewhat wide of the mark, in that unless it is being mis-sold and/or done on the cheap, the whole point of cloud is that it should not constitute a single point of failure - all the key data sets should be replicated in various locations around the Web. Multiple points of failure - well, that's a different matter...
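A toy illustration of the point - invented location names and keys, nothing like any real provider's internals - shows why replicating data across locations removes the single point of failure:

```python
class ReplicatedStore:
    """Keep every object in several 'locations'; a read succeeds
    as long as at least one replica location is still up."""

    def __init__(self, locations):
        self.stores = {loc: {} for loc in locations}
        self.down = set()

    def put(self, key, value):
        for store in self.stores.values():
            store[key] = value  # write to every location

    def fail(self, location):
        self.down.add(location)  # simulate a site outage

    def get(self, key):
        for loc, store in self.stores.items():
            if loc not in self.down and key in store:
                return store[key]
        raise KeyError(key)  # only if *every* replica is gone

store = ReplicatedStore(["eu-west", "us-east", "ap-south"])
store.put("invoice-42", b"...")
store.fail("eu-west")  # one site down - the data is still readable
print(store.get("invoice-42"))
```

Lose one location and reads carry on; only the loss of every replica - multiple points of failure - takes the data offline, which is exactly the distinction being drawn above.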
Edited: 25 July 2011 at 09:10 AM by James Hayes
Will cloud tale wag data centre doggedness?
6 May 2011 by James Hayes
Some commentators suspect that the two entities have already become inextricably linked - but are we heading toward a case of tail wagging dog, where cloud diktats lure data centre science off in a wayward 'direction of travel'?
In seeking an answer to that question it is important, some aver, not to over-estimate the nebulous nature of the cloud construct.
Mark Leonard, executive vice president of Colt's infrastructure services unit, takes the view that cloud computing is actually "an IT operating model - it is having an impact on data centre design," he says, although any perceived association between cloud and consumer confidence should not be over-rated, Leonard adds: the fundamental abstraction of cloud services from the underlying infrastructure is "more likely to weaken any perceived linkage between cloud service and data centre - especially in the consumer context".
It's a fair point: cloud computing in its essentials is not much more than an operating model (not unlike the OSI's multi-coloured seven-layer model, say) with a fluffy name that lends itself all too readily to marketing. That's where the OSI missed a trick - had it been more marketing-savvy, could the 1990s now be known as the era of 'Rainbow Computing'?
Edited: 16 May 2011 at 06:02 PM by James Hayes
Could Amazon cloud burst bring a silver lining for data centre revenues?
The line-up featured a sextet of stalwarts and fund managers from the banking and capital markets world; money men less interested in hot aisles than hot investments. While acknowledging that the business models deployed by data centre operators varied, and that only a small proportion of such players are publicly-quoted companies, the panel addressed the question: how do investors see data centres as money-making propositions?
The panel's take on technological advances was predicated on one concern: will this improve the value of my data centre sector stock, or will it devalue my holdings? It sounds dullish, but the capitalists actually provided some refreshing insight on talked-up trends.
One point that brought forth consensus was that, no matter how healthy prospects appeared - demand on the up, existing data centres being expanded, innovations appearing to address energy consumption, IT efficiency, cooling and ventilation, etc - it only takes one mass-market scare story to create a worrying wobble (although the chances of a dot-com bubble 2.0 are minimal). We're talking about the recent Sony and Amazon mishaps, of course. Frank Lukas, global CTO at Devonshire Investors, highlighted the fact that reports of the Amazon outage had given the impression that it occurred as a result of data centre problems: "In fact, as we know, it wasn't a facilities-based outage, [but] the knee-jerk reaction was that it was facilities-based," he said. "In fact it was the software that broke"; but the popular perception among so-called 'technoramuses' is that service failures are caused by hardware failures, rather than soft system errors.
This was not good news for data centre operators but, added Jonathan Atkin, MD of RBC Capital Markets, the adverse publicity may have its positive side: "It might cause customers to spend more with their data centre service provider, and pay for extra resilience" - something which, in 2011's penny-pinching economic climate, many have shied away from, he believes.
Agreement came from Jules Delahaije, CEO of Linxdatacenter: "These types of disasters help a lot. They challenge any notion that you can do these things on the cheap." They also help shift business away from the big, mega-centralised data centre operators toward the more decentralised players who can provide help to smaller customers when hiccups occur.
So according to the financial observers, Amazon's cloud burst may have a silver lining after all...
All quiet on the questing front?
5 May 2011 by James Hayes
It's a minor conspiracy theory, but one that has excited some debate on the first day of the Data Centres Europe 2011 conference, both in the open plenary debates and among the networking-break coffee congregations.
More pessimistic opiners have gone so far as to suggest that any further such mishaps involving 'cloud' will cause the technology to have to change its name in an effort to distance itself from lasting negative associations, and then pursue a 'new life' under a different market identity. Others are not so sure that the 'cloud computing' term was ever properly suitable as a generic label.
"Cloud computing used to be called 'utility computing'," says Gregor Petri, who enjoys the job title of 'Advisor, Lean IT & Cloud Computing' at CA Technologies, speaking at a midday panel debate billed as 'Which cloud to follow?'.
Making utility computing sexier and more mass-market-friendly by renaming it 'cloud' was an agreeable side effect of the reinvention, to be sure; but if the moniker becomes synonymous with risk, then the industry would have to consider another name change. "That's why they [the IT sector at large] are not talking about it [the Sony breach]," he suggests; besides which, the IT industry has never been entirely comfortable with 'cloud computing' as a generic technology descriptor, adds Petri: "Cloud is not an acronym - and the IT industry lives by acronyms".
So what should cloud computing be rechristened? Your suggestions invited below, please...
Edited: 28 June 2012 at 04:42 PM by Buzzsore Moderator
"There has been a lot of talk about the reported £30bn cost of the Sochi Games, so we go behind the scenes to find out where all that money has been spent"
- 3D Magnetic field rotation of light [09:39 pm 10/03/14]
- Repeated Alternator Failure on Power Plant rated 16MVA/ 11,000V using 12x 415V generators [06:53 pm 10/03/14]
- How and when will DECC's electricity capacity market fail? [01:30 pm 10/03/14]
- Technology that has died since the year 2000 [12:14 pm 10/03/14]
- what about intematix led assembly ? [11:33 am 10/03/14]
Tune into our latest podcast