Title : 4G-FLOPS supers launched
Author : CW Staff
Source : CW Comm
FileName: fuji
Date : Dec 19, 1988
Text:

TOKYO _ Fujitsu Ltd. unveiled a new weapon in the race to provide more computational muscle with its recent announcement of a series of supercomputers that the firm said offer more than twice the performance of its current models.

The eight-member VP2000 series sports a vector-processing speed of 4G floating-point operations per second (FLOPS) in a single processor, more than doubling the 1.7G-FLOPS peak performance of the firm's VP series E model, Fujitsu officials said.

Four models are also based on Fujitsu's newly developed dual-scalar-processor architecture. This design adds a second scalar unit to the typical uniprocessor configuration of one scalar unit and one vector unit, enabling system throughput to be doubled, the firm said.

The supercomputers run Fujitsu's proprietary MSP operating system as well as Unix. They can reportedly handle a maximum main storage capacity of 2G bytes and feature a system storage capacity of 8G bytes.

Analysts praised the offerings for their ability to exploit leading-edge technologies. ``The most interesting thing about the machine is its very aggressive use of very advanced technology _ circuit technology in particular,'' said Omri Serlin, president of Itom International, Inc., a research and consulting firm based in Los Altos, Calif.

The series uses 1M-bit static random-access memory chips boasting a 35-nsec access time, Fujitsu officials said, while the CPU is peppered with high-density chips.

Some uniprocessor models are scheduled to be available in the fourth quarter of 1989, but the full range will not be ready for shipment until the second half of 1990. Although the firm released no final purchase prices, rental prices for the series will begin at $300,000 per month.
By James Daly, CW staff

<<<>>>

Title : Pure PC wimpiness
Author : CW Staff
Source : CW Comm
FileName: colum1
Date : Dec 19, 1988
Text:

It's time to clone the Mac!

Often, people getting ready to buy a personal computer talk in loving terms about Apple's Macintosh. That is because deep down, most people want a Mac. But these same people turn around and buy an IBM clone through the mail from someone like Dell Computer.

Is Dell's better? If sophisticated software and advanced user interface techniques are any measure, heck no. But if the measure is pure value, then Dell or IBM or any other PC cloner is the better buy. That's the beauty of competition. For less than $3,000, users can pick up a fast IBM-compatible system with high-resolution color graphics, a 40M-byte hard drive and expansion slots galore. For the price of a similarly equipped Mac, you could buy a Hyundai with air-conditioning and toss in a used DEC Rainbow.

In the IBM-compatible market, the competition is gloriously cutthroat. Each supplier, including IBM, is forced to give more and more for the same price or else get out of the business. Microsoft's MS-DOS operating system is still subpar, but customers love the value.

In the Macintosh market, Apple is the only supplier in town. It controls the prices, but more importantly, it controls the technology. That is why a low-end Macintosh comes with one floppy disk drive and a tiny black-and-white monitor. And performance? It is so slow, you can watch your car rust while you load a program.

PC wimps

Users who love the Mac's approach to simplicity but don't appreciate the poor price/performance ratio can easily blame Apple. Might as well. But they should also lambaste the cowardice of the IBM cloners that have been too afraid to irk the folks from Cupertino, Calif. Pure PC wimpiness. Of course, these are the same clone companies that are whimpering about IBM's call for Micro Channel royalties. Get some pride.
Clone both, and let the lawyers worry about it. Besides, what's fair is fair. Apple can work with firms such as AST Research and Phoenix Technologies to enable the Macintosh to run Microsoft's MS-DOS _ but God help you if you clone the Macintosh. This isn't ironic. It's dumb.

Admittedly, the Mac was not worth cloning in its early years. It was pathetically slow and had lousy software. In short, it was a computer for the terminally stupid. Despite the lack of competition, the Mac has grown up. But the allure of the Mac has more to do with innovative software developers than with Apple's ingenuity. A major advance for Apple is still something like color, a separate monitor, expansion slots or maybe a keypad and cursor keys. Yippee.

Cloning around

While Apple dawdles along, everyone harps on IBM for its so-called proprietary architecture. Come on. IBM lets cloners clone _ as long as they don't steal BIOS code _ and is even making MCA licenses available in a semiattractive fashion. But what does Apple, that California-based bastion of hollow counterculture values, do? It has its lawyers threaten to sue anybody who even looks like they are cloning the Mac.

That is not the worst of it. Apple, confident about its lock on the market, earlier this year had the gumption to raise prices. Apple shops had to put up with it, while IBM clone shops congratulated themselves for sticking with MS-DOS. The PC folks can't get away with this type of gouging. With PC clones, you can play one vendor against another. With Apple, you get a price list etched in stone like the Ten Commandments. You will pay, you will not call us with your petty problems and you will blindly follow the technical direction we set.

Hey, instead of the Extended Industry Standard Architecture consortium working toward another 32-bit bus _ the Micro Channel and Nubus are just fine _ why don't they try cloning the Mac and putting together a huge defense fund?
Then at the same time, maybe they could speed the thing up a little.

By Douglas Barney; Barney is a Computerworld senior editor, microcomputing

<<<>>>

Title : Adapting tools to work gr
Author : CW Staff
Source : CW Comm
FileName: group
Date : Dec 19, 1988
Text:

End users often have had to adapt their working habits to fit their software. Now, an entire new class of software promises to adapt to the working habits of end users. It is called groupware, and vendors in the market claim that their programs can turn personal computer networks into powerful systems for coordinating the work of several end users.

Cooperative ventures, no matter what field of endeavor, are based on patterns of work such as setting goals, scheduling tasks, monitoring progress and performing similar activities. Groupware aims to support these cooperative efforts by coordinating the activities of each member within a group. This coordination of effort goes beyond simply sharing word processing, spreadsheets, electronic mail and other applications common to PC networks.

``We're building a product that lets end users create a coordination environment based on patterns that underlie how people work,'' explained a spokesperson at Coordination Technology, Inc. (CTI), which is based in Trumbull, Conn.

In this coordination environment, for example, senior executives could attend a meeting electronically and exchange information derived from a wide variety of sources. This information, ranging from voice to video, could be integrated into an action plan that could then be routed to middle managers who would implement the decisions of the group. Each middle manager would receive a transcript of the meeting tailored to the goals of his department. In another scenario, an information center manager could provide support and training activities that are precisely geared to the needs of each individual in the coordination environment.
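One way to picture such a coordination environment is as a simple routing rule: each request goes to the group member best placed to handle it. The sketch below is purely illustrative _ the names, topics and expertise scores are invented, and it is not a description of CTI's actual design:

```python
# Illustrative groupware-style routing: each member of the group has
# per-topic expertise scores, and an incoming request is directed to
# the member with the highest score for its topic.
# All names and scores here are invented for illustration.

expertise = {
    "alice": {"spreadsheets": 9, "email": 3},
    "bob":   {"spreadsheets": 2, "email": 8},
}

def route_request(topic):
    """Return the member most likely able to handle a request on this topic."""
    return max(expertise, key=lambda member: expertise[member].get(topic, 0))

print(route_request("email"))         # -> bob
print(route_request("spreadsheets"))  # -> alice
```

A real coordination environment would track many more work patterns than a single score table, but the routing idea is the same.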
A question from one end user in the group could be automatically directed to the end user most likely able to answer it. ``It's a whole new way of working in an electronic environment,'' the CTI spokesperson said.

`Fascistware'

But this ability of end users in the group to urge colleagues to take specific actions, monitor one another's work and the like has led some pundits to deride groupware as ``fascistware.''

A year ago, the market for groupware barely existed; there were only two vendors selling this new class of software. Those pioneers _ Action Technologies, Inc. in Emeryville, Calif., and Conetic Systems, Inc. in San Leandro, Calif. _ are about to face considerable competition from several companies, according to International Data Corp. (IDC), a market research firm in Framingham, Mass. Among the companies eyeing this market are Wang Laboratories, Inc. in Lowell, Mass., Wordperfect Corp. in Orem, Utah, and Informix Software, Inc. in Menlo Park, Calif. IDC predicted that Redmond, Wash.-based Microsoft Corp. will soon announce plans to venture into groupware.

The floodgates to what will be a lucrative market will soon open, reported IDC in a recent study. Groupware on local-area networks will account for 23% of the 192,000 licenses shipped worldwide in 1992, IDC said. It also predicted that worldwide revenue for LAN-based groupware will climb from about $24 million in 1988 to $218 million by 1992 _ a compound annual growth rate of 87%. While vendors in the groupware market are likely to tout ``increased productivity,'' they should spend more effort developing standards, IDC added.

By Michael Alexander, CW staff

<<<>>>

Title : Kaypro airs Micro I space
Author : CW Staff
Source : CW Comm
FileName: kaypro
Date : Dec 19, 1988
Text:

SOLANA BEACH, Calif. _ Most foot-wide computers that weigh in at 15 pounds are called laptops or portables. Not Kaypro's. Recently, Kaypro Corp. announced what it calls the world's smallest desktop system.
While most computers of this size are designed for portability, the Kaypro Micro I is aimed at saving desk space, the firm said. As more and more people are crammed into smaller and smaller cubicles, every inch counts.

The Micro I saves space in several ways. It uses a high-resolution LCD display, similar to ones found on laptops. Instead of the three to five expansion slots that most personal computers contain, the Micro I comes with just one. Instead of bulky 5 1/4-in. floppy drives, the Micro I uses two 720K-byte 3 1/2-in. floppies. If users prefer a hard disk, they can swap it for one of the 3 1/2-in. floppy drives.

The system sells for $799 without a display. An optional LCD screen costs another $249. The product is available now without an expansion slot and is slated to be available early next year with an expansion slot.

The Micro I uses the V-20 processor from NEC Corp., which provides compatibility with Intel Corp.'s 8088. The machine comes standard with 512K bytes of random-access memory, a parallel printer port and an RS-232C asynchronous serial port. The Micro I will be sold via mail order and will also be available through Kaypro retailers.

By Douglas Barney, CW staff

<<<>>>

Title : Netview cornering U.S. ma
Author : CW Staff
Source : CW Comm
FileName: netview1
Date : Dec 19, 1988
Text:

LOS ALTOS, Calif. _ Users seeking to combine voice and data networks are expected to turn increasingly to IBM's host-based Netview system, according to a recent report that also predicts IBM will emerge as the clear leader in the U.S. network management market.

Netview's growing popularity derives partly from IBM's huge installed base of Systems Network Architecture users and partly from its ``superior'' data management capabilities, according to ``The IBM Directions Report,'' available next month from International Technology Group (ITG).
``If you are predominantly IBM, Netview is the way to go,'' said Thomas Nolle, president of CIMI Corp., a Haddonfield, N.J., consulting firm.

Also, ITG cited a number of major enhancements on the way from IBM, including a comprehensive software distribution facility, dynamic voice/data network reconfiguration and new lines of specialized network management equipment. Further expansions of Netview will include support for international networks, Integrated Services Digital Network (ISDN) installations and facilities management.

Going IBM's way

A number of broader market trends are also working in IBM's favor, according to the report. One trend is users' increasing focus on the data side of networking as data traffic takes up a growing proportion of companies' communications use. Data traffic made up about 26% of 1988 traffic volume in Fortune 500 network installations. This level is increasing and is expected to reach 57% by 1990.

Also in Netview's favor is the growing number of companies that have combined voice and data management under a single operation, which is then headed up by MIS in more than 70% of cases, according to ITG. As networks become more complex, users will increasingly turn to the data management capabilities of mainframes, ITG said.

The high costs of networking will also push users toward IBM, ITG predicted. In 1987, communications managers in Fortune 500 companies spent about 7% of their budgets on network management equipment. ITG predicts this will more than triple to 22% by 1991. IBM remains one of the few vendors capable of offering comprehensive and relatively low-cost network management support and services, the report said. ``IBM will deliberately price its offerings inexpensively to gain market share and build account relationships in this area,'' the report said.

Another Netview strength is broad industry support: More than 30 vendors now support the Netview/PC interface to the host-based system.
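ITG's arithmetic can be double-checked in a few lines. The percentages below are the ones quoted in the report; the script itself is just an illustrative sanity check, not part of ITG's study:

```python
# Sanity check on the ITG projections quoted above.
budget_1987, budget_1991 = 0.07, 0.22    # share of comms budget on network mgmt
traffic_1988, traffic_1990 = 0.26, 0.57  # data share of total traffic volume

budget_growth = budget_1991 / budget_1987
traffic_growth = traffic_1990 / traffic_1988

print(round(budget_growth, 2))   # -> 3.14, i.e., "more than triple" checks out
print(round(traffic_growth, 2))  # -> 2.19
```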
Such broad support may cause some users to go with IBM's proprietary system rather than wait for the Open Systems Interconnection standard, Nolle indicated. He cautioned, however, that most vendors' Netview support is limited to sending network alerts up to the host-based system. ``It's one-way support; you can't send any information back,'' he said.

Despite this limitation, Netview is still in a strong position, just because IBM is IBM, ITG said: ``Investments in IBM systems and software, and the more than 35,000 SNA networks already installed worldwide, provide IBM with a great deal of leverage in addressing the requirements of corporate network users.''

By Patricia Keefe, CW staff

<<<>>>

Title : ISDN standards ratified
Author : CW Staff
Source : CW Comm
FileName: nzisdn
Date : Dec 19, 1988
Text:

AUCKLAND, New Zealand _ Product developers and manufacturers keen to be a part of the new telecommunications environment can take heart that official standards for Integrated Services Digital Network (ISDN) were recently ratified. The CCITT plenary session in Melbourne, Australia, unanimously accepted the 2B+D proposal for two 64K bit/sec. B channels and one 16K bit/sec. D channel for signaling, which had been under discussion since the last plenary session four years ago.

``They will now address the proposed broadband standards, which will enable slow-frame moving pictures to be transmitted,'' said New Zealand's CCITT delegation head, Jack Skurr. The results of the various discussion groups will go forward to the 1992 plenary session in Geneva for ratification, he said.

``The telecom community has accepted, for some time, that the concept of ISDN is fairly well in place and accepted it as the direction in which everyone is going. The only things at issue are when telecom providers want to go ahead and do it and how many customers want the service,'' he said. ``ISDN is no longer looked upon as an innovation subscribers don't need,'' Skurr said.
``Everyone believes it has a firm place in the future development of telecommunications systems.''

ISDN, a networked service combining data, voice and images on the same lines, is scheduled to undergo trials in New Zealand in January 1990 and be available commercially the following year.

Skurr, a consultant employed by the New Zealand Department of Trade and Industry, said about 400 delegates from around the world attended the CCITT conference to consider a series of papers that stood one meter high. ``There was no way we could get down to details, but we did look at ways of speeding up the whole process and minimizing the cost while still meeting the needs of member countries,'' he said.

A resolution was passed during the assembly to ensure that recommendations finalized by study groups can be approved without waiting for the quadrennial plenary sessions. A constitutional change to allow the speedup will go through to the plenipotentiary meeting in Nice, France, in June. An agreement was also reached to increase the use of polling groups to bring consensus views back to the meetings.

About six members of the New Zealand delegation attended the CCITT conference at any one time. A follow-up meeting late last week, attended by many of the same CCITT delegates, considered a controversial 11-page paper that flies in the face of telecommunications deregulation [CW, Dec. 5].

By Keith Newman, IDG News Service

<<<>>>

Title : Sungard adds downtime ser
Author : CW Staff
Source : CW Comm
FileName: sun5
Date : Dec 19, 1988
Text:

NEW YORK _ Sungard Recovery Services, Inc. last week unveiled several networking and satellite services designed to help customers cope with computer downtime caused by communications and processor failures. In addition to the services, the company introduced a new chairman and chief executive officer, Kenneth Adams, the former president of Sungard Trust Systems, Inc., the trust accounting unit of Sungard Data Systems, Inc.
Adams replaces Richard Aldridge, who resigned for personal reasons, according to the company.

Sungard's Switched Satellite Service is a very small-aperture terminal (VSAT) communications alternative to Earth-based service for communicating with a Sungard hot site.

From above

Installations will require a permanently installed VSAT dish costing $3,400 per month, including the cost of the dish, modems, cabling and other equipment. VSAT services will carry a rate of $300 to $750 per hour, depending on the selected bandwidth _ from 56K to 1.5M bit/sec., Vice-President of Network Services Jim Domanico said. The nationwide service is slated to become available by year's end. A Southeastern bank is serving as the beta-test site.

According to Norm Harris, president of Harris Devlin Associates, a contingency planning service and Sungard subsidiary in Dublin, Ohio, ``The satellite service will greatly improve communications capability and decrease the time necessary to restore network operations after a disaster.''

The company is also building a T3 multiplexer communications backbone to serve major East Coast cities.

The East Coast run

The first segment will run from New York to the Philadelphia hot site with nodes in Newark and Princeton, N.J. According to Domanico, the Sungard Network Access Point's 45 services, which divide and reroute T1s, will run dual communications paths and will be able to switch carriers in an outage. Prices range from $100 to $2,000 per month, depending on the number of T1 lines used, and are slated to be available in the first quarter of 1989.

Another communications offering allows customers to contract for an inventory of cellular phones in the event of communications problems. While the services are presently limited to voice, they will accommodate data communications by spring 1989, Domanico claimed.

The company is also introducing disaster recovery services at its San Diego facility for Digital Equipment Corp. computers.
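The VSAT rates above amount to a simple cost model: a fixed monthly dish charge plus metered hours at the chosen bandwidth tier. The sketch below uses the rates quoted by Domanico; the hours of use are an invented example, not a Sungard figure:

```python
# Rough monthly-cost estimate for Sungard's Switched Satellite Service,
# using the rates quoted in the article. Usage hours are hypothetical.

DISH_MONTHLY = 3400                  # installed VSAT dish, $ per month
HOURLY_LOW, HOURLY_HIGH = 300, 750   # $ per hour, by selected bandwidth

def monthly_cost(hours, hourly_rate):
    """Fixed dish charge plus metered service time."""
    return DISH_MONTHLY + hours * hourly_rate

# e.g., a 20-hour recovery exercise at the top bandwidth tier:
print(monthly_cost(20, HOURLY_HIGH))  # -> 18400
```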
The San Diego service offers a cluster of two VAXs and 128M bytes of storage. Monthly charges range from $1,500 to $5,000. Customers will also be required to pay a declaration fee of $25,000.

By Robert Moran, CW staff

<<<>>>

Title : Governing concern
Author : CW Staff
Source : CW Comm
FileName: centers
Date : Dec 19, 1988
Text:

CLEVELAND _ For all the challenges and problems associated with managing end-user computing and information centers, two common themes dominate managers' actions: how quickly user computing should expand and what controls on such computing are appropriate.

That was the key finding of two researchers who surveyed information center managers in 37 major U.S. and Canadian corporations. The results are scheduled to be published this week in the Cleveland-based Association for Systems Management's Journal of Systems Management.

The researchers noted that with most of an information center manager's actions being classified as expansion- or control-oriented, the manager can use a grid concept to rate the degree of expansion and control that is appropriate for his firm and then match his end-user computing strategy to the corporate plan.

``Success in managing end-user computing is contingent upon what the organization itself is trying to do. It depends on what the senior managers are trying to do. For example, if you have a senior manager whose dominant feature is going slow with a lot of controls, then you can plan your end-user computing with slow expansion and a high degree of control,'' said Sid L. Huff, an associate professor of MIS at the University of Western Ontario. Huff co-authored the research with Malcolm C. Munro, professor of MIS at the University of Calgary.

Huff said that under a more dynamic corporate management, the information center should plan on a high degree of end-user computing expansion and controls. He noted that controls include not only restraints but also direction and guidance for users.
By James Connolly, CW staff

<<<>>>

Title : Still tinkering, Block dr
Author : CW Staff
Source : CW Comm
FileName: block
Date : Dec 19, 1988
Text:

He may not be a whiz kid any longer, but Arthur Block is a leading-edge kind of guy. While most of his contemporaries are buying personal computers with a megabyte or so of random-access memory, this Manufacturers Hanover Corp. vice-president in charge of end-user automation support is getting ready to buy machines with 8M bytes. While others are largely picking up Intel Corp. 8088- and 80286-based machines with a scant sprinkling of Intel 80386s, Block demands a good reason to buy anything less than an 80386. And while most users stick to the tried-and-true character-mode interface for most applications, Block is already solidly in the Microsoft Corp. Windows graphical user interface camp.

Young tinkerer

Block's background provides more than a few clues as to why he picks up the latest and greatest and puts it to use. He is still very much the tinkerer who built electronic gadgets as a youth. Block, now 46, says that as a 9-year-old boy growing up in The Bronx section of New York, his ambition was to become an electrical engineer.

As a science student in the early 1960s, Block was one of the few captivated by computers. He learned almost instantly. As a college junior, he taught the first computer course ever offered at the Stevens Institute of Technology. After that, there was no turning back. Electrical engineering was out, and programming was in.

At the time, mainframes were the only game in town, so that is what Block programmed. In fact, he got his first job at Manufacturers Hanover in 1975 as a project manager, working exclusively on mainframes. Later, as PCs hit the streets, Block was enthralled. As these computing devices began to seep into the company, he was there, fiddling, tinkering and managing.
Eventually, the self-proclaimed PC agitator made a clean break from mainframe duties and pursued microcomputer implementation. It is this enthusiasm for technology that makes Block different. With enthusiasm has come understanding. ``In the past, I have been technically superior to those I worked with and for. Now I have a boss that I have difficulty keeping up with technically,'' notes Peter T. Keilty, vice-president in charge of the PC development group at Manufacturers Hanover. How does Keilty deal with this? ``Lots of preparation.''

Technology is more than just a job for Block. It is a big part of his life, say those who work with him. According to Keilty, Block takes products home for serious testing before they are used at the company. At least, that is one of the things he does when not installing security systems, building furniture or adding rooms to his New Jersey home.

Fortunately for Block, the company's senior management is pro-technology and willing to shell out the bucks to reap the benefits. ``We had senior management at the bank that recognized that we really hadn't done much to automate our front offices or account offices,'' Block explains. This recognition started him on an automation path that has yet to end. It began in 1986, when the firm's management funded a project called Infonet, which was designed to automate the desks of these account offices.

Naive users

The problem was how to do it. Block knew he would be computerizing a large group of naive users. He worried that they would be frustrated by the different user interfaces presented by the various PC software offerings. Things as basic as saving a file or printing were done differently based on the whims of the programs' authors. ``That was absolutely unacceptable,'' Block said. The answer was a graphical user interface such as that found in the Apple Computer, Inc. Macintosh or Windows.
Although the Mac interface was more advanced than Windows, Block had a little problem with its proprietary architecture. ``If Apple decided to go off in a different direction or decided to, God forbid, raise prices 30%, you were stuck,'' Block said, referring to Apple's decision earlier this year to boost prices across the board.

Manufacturers Hanover agreed with Block and standardized on Windows for its account officers, which has worked out just fine for Block. His programming team has transformed various Windows applications into a system tailored specifically to the banking business. On the menus, users see banking terms, not spreadsheet or word processing options. In fact, users have no idea they are actually using a spreadsheet or database package. Instead, they see choices such as profitability and credit analysis.

This system comes at a cost, however. It uses expensive 80386-based computers, high-resolution monitors and gobs of memory. However, Block is automating the jobs of key and senior employees, whose time and decisions are valuable. ``Account officers can put together a proposal in a day or two where it used to take a week to a week and a half,'' Block explains. And because all the PCs are tied to laser printers, the quality of the output is better than it used to be.

You ain't seen nothin' yet

Windows is just the beginning. Even though most users plan to begin a migration to Microsoft and IBM's OS/2 Presentation Manager several years from now, Block will get started next summer. That is why Block is about to start buying his machines with 8M bytes of RAM.

Block is also using database technology that most others are just starting to talk about. While the industry is buzzing about a micro database future characterized by IBM's SQL and the so-called client/server architecture, Block has it in place. The account officers' workstations are tied to Gupta Technologies, Inc.'s SQLbase, which provides multiuser access to a common database.
Block's leading-edge mentality creates both challenges and risks, according to those who report to him. Often, he is deciding how to use technologies that are not even finished. ``There is a dilemma of trying to work with unstable technology,'' said Rich Luciano, a vice-president who reports to Block.

Like Block, Luciano believes that using the latest and greatest pays off. ``We will see it on the bottom line,'' Luciano says, pointing to improved methods of evaluating the bank's risk and exposure in investments and improvements in marketing to customers.

Not only do those reporting directly to Block believe in the payoff; so, apparently, does the company's senior management, which continues to fund Block's high-end approach. And so, of course, does Block.

By Douglas Barney, CW staff

<<<>>>

Title : Firestone MIS exec faces
Author : CW Staff
Source : CW Comm
FileName: fsexec
Date : Dec 19, 1988
Text:

CHICAGO _ As Firestone Tire & Rubber Co.'s new executive director of MIS, Robert L. Malizia faces many challenges, including guiding his MIS department through Firestone's recent acquisition by Bridgestone Corp.

Malizia, a 20-year veteran of the data processing industry, began his career at the Ford Motor Co. and most recently was manager of computer and communications services at Firestone in Akron, Ohio. He has also been employed by Navistar International Corp. and Reynolds and Reynolds Co.

``One of the things we're going to have to do is develop global systems to develop the two companies across Europe and the nation,'' Malizia said of Firestone and Bridgestone. Over the next five years, he added, the two tire companies will continue to expand throughout the world marketplace, with MIS involved in most phases.
``We need to be very flexible and able to build products anywhere in the world, ship anywhere in the world and react to pressures from changing tariffs and changing money values.''

Culture mesh

An additional challenge lies in merging the diverse cultures of U.S.-based Firestone and Tokyo-based Bridgestone, according to Malizia. ``I feel that we have some very exciting years ahead; these are exciting challenges,'' he said.

Currently, the Bridgestone and Firestone MIS departments are operating as separate entities, Malizia said. ``We're in the planning stage with Bridgestone as to what kind of working relationship we're going to have with them long-range. Right now, we are working as equal vested partners.''

Malizia holds both a bachelor's and a master's degree from Eastern Michigan University. At Firestone, he reports to the chief financial officer, Robert Anderson. He replaces Laurance T. Burden, who has moved to S. C. Johnson & Son, Inc. in Racine, Wis.

By Alan J. Ryan, CW staff

<<<>>>

Title : Young lions emerge in CDC
Author : CW Staff
Source : CW Comm
FileName: cdcuni
Date : Dec 19, 1988
Text:

Unisys Corp. and Control Data Corp. both announced top management reshufflings last week that established some executives as potential heirs apparent and sent others packing their bags.

At CDC, Chairman and Chief Executive Officer Robert M. Price ceded his president's title to Lawrence Perlman, who engineered the dramatic turnaround of CDC's storage products business, one of the few bright spots in the Minneapolis-based company's inconsistent recovery efforts. Perlman, 50, who headed CDC's former Commercial Credit Corp. subsidiary before moving to the storage business in 1985, was also named to the new position of chief operating officer.

One more step

Executive Vice-President John K. Buckner, 52, also took a step up the ladder to the new title of vice-chairman while retaining his post as chief financial officer. Departing CDC as a result of the reshuffling was Thomas C.
Roberts, president of computer systems and services. Perlman will be acting president of that unit and will also continue to head the Imprimis storage business until a successor is found.

Like 58-year-old Price, Unisys Chairman W. Michael Blumenthal yielded some of his responsibilities to newly promoted executives, a strong sign that they are in the running to someday replace the 62-year-old Blumenthal. In moving from senior vice-president to executive vice-president, CFO Curtis A. Hessler will become responsible for Unisys' defense systems business, which formerly reported to Blumenthal. Hessler was also named to the board of directors and will head Unisys' financial and strategic planning.

Other apparent winners in the changes were Executive Vice-President James A. Unruh, whose duties expanded from heading international marketing to all commercial marketing, and Convergent, Inc. Chairman Paul C. Ely Jr., who was named to the Unisys board. Ely will become a Unisys executive vice-president after expected shareholder approval this week of the acquisition of Convergent by Unisys.

Writing on the wall

Unisys also announced a significant departure _ Senior Vice-President Jan Lindelow, the last remaining member of the former Sperry Corp. top management team at Unisys. Lindelow will pursue other unspecified interests. Many observers felt the writing was on the wall for former Sperry executives at Unisys one year ago, when former Sperry CEO Joseph Kroger departed. He now heads Biin, the joint venture of Intel Corp. and Siemens AG.

By Clinton Wilder, CW staff

<<<>>>

Title : Get used to merger fever
Author : Charles Varga
Source : CW Comm
FileName: varga2
Date : Dec 19, 1988
Text:

Is the current merger mania in the computer industry part of the natural growth process? Or are we seeing the fat cats, investment bankers and carpetbaggers scoring a quick land-grab, riding the merger wave's momentum and the coattails of all those big-bucks megadeals that make the news?
Last year, in an attempt to vent my feelings through comic relief, I invented a fictitious merger-and-acquisition firm and started answering my telephone: ``Pillage, plunder, plague, ravage and burn. Hun-Khan Partners: We sack 'em, you loot 'em.'' To my surprise, although many people thought it was funny, few really understood my point. Be that as it may, users and vendors have resigned themselves to playing in an industry in which consolidation is the order of the day. ``Acquisitions and mergers are neither good nor bad; they are a fact of life,'' says Michael Jones, director of information systems at Knight-Ridder, Inc. There are real concerns about the effects on acquired firms' clients and users. Bill Mann, vice-president of MIS at Wickes Co., says, ``It's very traumatic to have your large vendor acquired. All of a sudden you're dealing with someone you don't know. Are they going to support the product, in which you have invested a great deal of effort and money?'' Jim Moran, vice-president of corporate planning and development at H & R Block, concurs. ``Some are good and some are bad,'' Moran says. ``The bad ones are characterized by a great many layoffs, and I don't think they are good for the industry. The power ends up in fewer hands. On the other hand, venture capitalists and others need to cash out. In that respect, acquisitions have been good.'' Good and bad mean different things to buyers and sellers, according to Haig Bazoian, the former president of Xerox Computer Services who now heads his own Beverly Hills-based consultancy, Haig M. Bazoian Co. ``For the acquirer, strategic fit added something to their business that gave them some edge over the competition,'' Bazoian says. ``From the acquirees' point of view, they would gain investment monies or some sort of distribution for their product line or new markets.'' For users, of course, concerns are different: Are the clients going to get better customer service? Are they getting a better product? 
``These are the kind of comments I hear about Computer Associates these days,'' Bazoian says. ``I think everybody's concerned that they are basically harvesting everything _ they don't care about the customer. They are slowly pulling everything they can out; they are not really putting funds into good quality and high customer service. That's certainly the image. My sense is that in the next two years, Computer Associates will respond to these strategic weaknesses and look at them as opportunities.'' Like birds and bees Few industry figures have had more contact with merger and acquisition activity than Bob Weissman, chief executive officer of Dun & Bradstreet. D&B has completed more than 60 industry deals in the last five years, and Weissman also subscribes to the ``they're a fact of life'' philosophy. ``The acquisition and merger process is a natural adjunct to the process of the development of any fast-growing market,'' Weissman says. ``Acquisitions and mergers are created by lots of entrepreneurial opportunities _ the early failure of most of them, the development of the rest and then the process of consolidation. This helps develop a critical mass of skills and financial resources.'' In terms of the process' impact on customers, Weissman notes, ``It's kind of ethically neutral. There is nothing inherent in the act itself which leads to good or bad things.'' Whether we like them or not, acquisitions and mergers are here to stay. They'll be with us until there are two firms left in the universe whose last gasp is, ``Do you want to do a deal?'' And then there will be one firm left. By Charles Varga; Varga, a 20-year computer industry veteran based in Frenchtown, N.J., is publisher of ``The Cerberus Report,'' a study of industry mergers and acquisitions <<<>>> Title : Honeywell Bull at a cross Author : CW Staff Source : CW Comm FileName: bully Date : Dec 19, 1988 Text: BILLERICA, Mass. _ Honeywell Bull, Inc. 
is about to get a new name, but it is not going to tell what it is until a formal announcement in early 1989. It is anybody's guess what it will be called after Honeywell, Inc. decreases its ownership of the company to less than 20% and bows out of the title at the end of this year. But with major money committed to applications development, advertising and strong new product entries, Honeywell Bull executives said customers will be calling the company a contender. When Honeywell merged its computer business with Japan's NEC Corp. and France's Groupe Bull to form Honeywell Bull in 1986, it reserved the option to decrease its ownership participation by the end of 1988. Neither the corporation nor its customers are unprepared for the change. Raging Bull ``Basically, Bull has been running the show from the start,'' said Steven Milunovich, a large-systems industry analyst at First Boston Corp. That is nothing but good news for users, said Roland Kelley, MIS director at Tewksbury, Mass.-based supermarket chain Demoulas Supermarkets, Inc. ``Groupe Bull is definitely more committed to the MIS community than Honeywell was,'' Kelley said. Donald Bellomy, an analyst at Framingham, Mass.-based market research firm International Data Corp. (IDC), said the company's new incarnation represents more than a mere technicality. ``Lines of authority should be clearer,'' he said. Groupe Bull Chief Executive Officer Jacques Stern ``will emerge as the real CEO, with [Honeywell Bull President and CEO] Roland Pampel more of a local caretaker,'' Bellomy predicted. ``Certainly, the strategic direction will come much more from Paris than has been the case.'' The Gallic flavor might not suit every palate, Bellomy warned. ``Not only the presence of Bull, but the name Bull will be much more in evidence and, like it or not, there's a certain amount of xenophobia in the industry,'' he said. 
Honeywell Bull expects the economic realities of the global economy to rule the minds of customers, whether current or potential, according to Marketing Communications Vice-President David Dotlich. ``Bull isn't a foreign company _ it's an international company,'' Dotlich said. With revenue of approximately $6.6 billion, it is among the largest in the world. The Bull connection should not threaten U.S. customers' patriotism, Dotlich said; it should help them hedge their bottom-line bets. ``They're reassured that we're part of such a large, worldwide concern,'' he said. The question likely to linger in users' minds after Jan. 1, Bellomy said, is, `` `Will the company make it?' And that's no new question for Honeywell customers.'' The answer is likely to center around three words: compatibility, applications and marketing, said Curt Beaumont, IDC's director of systems and peripherals. These were the focus of collective concerns expressed at the October annual meeting of worldwide MIS directors of one leading Honeywell Bull customer, General Electric Co., according to Beaumont. Lean on me Noting an increasing leaning toward IBM among the GE MIS contingent, Beaumont said many of them think that Honeywell Bull has excellent on-line transaction processing (OLTP) capability _ its recently unveiled Titan, or DPS 9000, is claimed as the fastest OLTP performer on the market _ but is nevertheless unsure that the company will be able to serve its overall corporate needs in coming years. Honeywell Bull has its own three words to allay such doubts, Dotlich said: ``Just watch us!'' While he declined to spell out the strategy that is going into place as the company ends its two-year transition period, he outlined just what it is users should watch for in 1989. ``Heavy funding for new application development, both internal and through outside alliances, and a very aggressive advertising campaign,'' he said. 
By Nell Margolis, CW staff <<<>>> Title : National Semi sells Datac Author : CW Staff Source : CW Comm FileName: data1 Date : Dec 19, 1988 Text: SANTA CLARA, Calif. _ National Semiconductor Corp., which is rumored to have its National Advanced Systems (NAS) mainframe business on the block, announced last week it is selling off another piece of its business, Datachecker Systems, Inc., to Great Britain's largest computer firm. The new owner, ICL, is the mainframe and information systems subsidiary of London-based telecommunications giant STC PLC. ICL is the leading purveyor of retail point-of-sale (POS) systems in Europe, Australia and Japan, according to a company spokesman. Separately, National Semi reported a $25.2 million loss in the quarter ended Nov. 27 but said both Datachecker and NAS showed strong growth in the quarter. It blamed the loss on weak chip sales. Datachecker will cost ICL $90 million. It has an installed base worth $1 billion, according to the ICL spokesman. National Semi has not earmarked the income for any particular project and would not say whether the sale will make it any easier for the firm to hang on to NAS. Apparently, the company was having difficulty getting Datachecker enough money to grow the business. Datachecker would only expand ``if it was with an organization that was focused on the retail business,'' a spokeswoman said. ICL would not comment on whether it will keep Datachecker headquarters in Santa Clara or on whether any layoffs are planned. By J.A. Savage, CW staff <<<>>> Title : VAX 8600s going out to pa Author : CW Staff Source : CW Comm FileName: 1decdrop Date : Dec 19, 1988 Text: MAYNARD, Mass. _ Digital Equipment Corp. will begin gradually phasing out its VAX 8600 and 8650 systems next year when the firm rolls the machines and their options over into a ``maintenance-only'' status, DEC has confirmed. 
Final orders for the models _ which were once the plums of the VAX line _ will be taken May 19, and the last 8600s and 8650s will go out the door June 30, said Ken Donaghue, DEC's public relations manager for high-performance systems. Customers will be notified by mail by the end of this month. Orders received after that time for the 4.4 million instructions per second 8600 or 6.8-MIPS 8650 will be filled by refurbished systems on a first-come, first-served basis. ``The machines have been very successful, but as new products with better technology are introduced, we feel it makes more sense to migrate our customers in that direction,'' Donaghue said. Although DEC has publicly acknowledged the move, no 8600 and 8650 users contacted by Computerworld had heard of the development. While most expressed disappointment, none were surprised. ``I don't disagree with what they're doing; I just wish they'd give us faster [lower end VAX] 6200s before they did it,'' said David Renaud, director of technical services at Grinnell Mutual Reinsurance Co. in Grinnell, Iowa, which uses two 8650s and three 6200s. ``The [high-end VAX] 8800 series is just too expensive for us to justify going to.'' Others noted that they first heard the death knell for the two models when DEC began to hype the dual-bus VAXBI architecture of their newer models over the 8600's and 8650's Unibus-based system. ``Even when we started leasing our 8600, we knew it wasn't the latest and greatest technology,'' another user said. ``Besides, I can't get mad at DEC for changing products. I like to see a vendor stay up to date.'' There are more than 5,300 VAX 8600 and 8650 systems installed in the U.S., according to Computer Intelligence, a La Jolla, Calif.-based research firm. Closed gap When the 8600 was announced in late 1984, DEC clearly needed to play catch-up in its long-standing superminicomputer war with rivals like Data General Corp. and Prime Computer, Inc. 
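Ratings like the 8600's 4.4 MIPS and the 8650's 6.8 MIPS invite dollars-per-MIPS comparisons of the sort users and analysts draw. A back-of-the-envelope sketch; the MIPS figures are the ones cited here, but every price below is a hypothetical placeholder, not an actual 1988 list price:

```python
# Rough dollars-per-MIPS comparison for superminis of this class.
# MIPS ratings come from the article; the price figures are
# hypothetical placeholders, NOT actual DEC list prices.
systems = {
    "VAX 8600": {"mips": 4.4, "price": 400_000},       # hypothetical price
    "VAX 8650": {"mips": 6.8, "price": 550_000},       # hypothetical price
    "Microvax 3600": {"mips": 3.0, "price": 150_000},  # hypothetical price
}

def dollars_per_mips(price: float, mips: float) -> float:
    """Simple cost-effectiveness metric: purchase price divided by MIPS."""
    return price / mips

for name, spec in systems.items():
    print(f"{name}: ${dollars_per_mips(spec['price'], spec['mips']):,.0f} per MIPS")
```

With any plausible numbers plugged in, the arithmetic shows why a cheaper, slower machine can still undercut a faster one on this metric.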
The machine went a long way in closing the price/performance gap, said sources familiar with the VAX line. When the 8650 was announced a year later, DEC claimed the machine could be clustered to match the performance of IBM's largest mainframes. But the two models never had much of a shelf life. The slow decline of the 8600 and 8650 began as long ago as early 1987, when the word drifted out of Maynard that DEC would not promote the old models as aggressively as the newer VAX 8530 and 8550 because the newer models offered better price/performance ratios. The 8600 line has also come under low-end pressure from the Microvax 3600, which offers a 3-MIPS performance rate and is less than half the price of the 8600. Additionally, other vendors have turned up the heat on the 8600 and 8650. The 8600 and 8650 ``are going head-to-head with Hewlett-Packard and losing very badly because they're so much more expensive per MIPS,'' said John Logan, executive vice-president with Boston-based Aberdeen Group, a market research firm. ``The machines have simply reached the end of the line.'' While the 8600 and 8650 slip away next year, DEC is expected to announce a series of workstations in 1989, beginning with the so-called single-user PVAX system Jan. 10 [CW, Dec. 5]. By James Daly, CW staff <<<>>> Title : Was breaking up so hard t Author : CW Staff Source : CW Comm FileName: 1breakup Date : Dec 19, 1988 Text: When AT&T's Bell System monopoly was dismembered five years ago, the conventional wisdom was that AT&T would soon rival IBM as a global computer powerhouse and the divested ``Baby Bells'' would be left with nothing but the dregs of telephone service. Instead, AT&T has struggled in the computer business, and the seven regional holding companies have turned into highly profitable giants. Indeed, each of the seven holding companies _ Ameritech, Bell Atlantic Corp., Bellsouth Corp., Nynex Corp., Pacific Telesis Group, Southwestern Bell Corp. 
and US West _ has operating revenues exceeding $7 billion, ranking AT&T and all but one of the regional holding companies among the 100 largest corporations in the world. Postdivestiture America has also seen a few surprises. For example, the Justice Department in 1987 urged the court to relax several of the settlement's restrictions on the regional holding companies _ restrictions that the Justice Department had drafted in 1982. In another turnabout, the Bell companies are now engaged in a high-powered lobbying campaign for legislation to wipe out the court restrictions on their business ventures. In contrast, just prior to divestiture, executives of the Bell operating companies told Congress that the court-approved settlement was a good deal. Divestiture also gave viability to firms such as U.S. Sprint Communications Co., MCI Communications Corp. and a host of smaller interexchange carriers, analysts said. The emergence of competitors to AT&T in the long-distance market has led to a vigorous battle for customers in the Fortune 500 and a related effort to build digital, fiber-optic networks to attract those sophisticated users, according to Henry Geller, director of the Washington Center for Public Policy Research. Sprint's huge investment in a fiber-optic network ``forced AT&T to modernize much more rapidly than they otherwise would,'' said Herschel Shosteck, a telecommunications consultant in Silver Spring, Md. Nevertheless, AT&T's share of the long-distance market has declined from about 84% in 1984 to 68.8% in mid-1988, according to Federal Communications Commission data. Another result of divestiture was that it allowed AT&T to enter the computer business. AT&T entered the computer field in 1984 with its 3B2 line of minicomputers and several personal computers. But it fared poorly and in 1986 alone reportedly lost $1.2 billion from its computer operations. 
While AT&T's computer fortunes are beginning to look up, ``they've gone down a painful learning curve and are beginning to emerge from that,'' said Peter Winder, vice-president of the San Francisco Consulting Group, Inc., a telecommunications consulting firm. The regional holding companies experienced a similarly painful learning curve with their efforts to diversify into unregulated businesses, Winder said. After a few fiascos, such as Ameritech's ill-fated purchase of software vendor Applied Data Research, Inc., the Bell companies are maturing and sticking closer to telecommunications, he added. Meanwhile, the Bell holding companies are lobbying to free themselves from the Modified Final Judgment's prohibition on entering the long-distance, manufacturing and information service businesses. In the first triennial review of the decree, Judge Greene loosened the restriction on information services to allow the Bell companies to provide transmission and gateway services, but the Bell companies are clamoring for more freedom [CW, Nov. 21]. The major thrust of the lobbying effort is to get Congress to either relax some of the restrictions or remove administration of the decree from Greene and hand it over to the deregulation-minded FCC. ``But I think that effort will fail,'' said Alan Pearce, president of Information Age Economics, Inc., a telecommunications research firm in Bethesda, Md. ``A bill may be introduced in the 101st Congress, but I would bet money that it won't pass.'' The mood on Capitol Hill, Pearce said, is to wait for the second triennial review in 1990; then, if displeased with the outcome, Congress may act. For the second triennial review, the Justice Department is scheduled to file its report Jan. 1, 1990; Greene will then consider the recommended changes and respond in late 1990 or early 1991, Pearce said. 
However, that timetable may be thrown off, because the Justice Department reportedly is seeking a delay so it can file its report in September 1990. Pearce predicted Greene will lift the restriction on equipment manufacturing, which will release the pressure for legislation, perhaps loosen the restriction on information services and retain the ban on long-distance service. Because the Bell companies still have monopoly control over the local exchange, they could dominate short-mileage markets and drive long-distance companies out of business, Pearce explained: ``The long-distance restriction will be the last to fall.'' For this reason, in his recent speech, Greene again put the Bell companies on notice that he will not remove any of the remaining restrictions while the companies have monopoly control over the local network. Greene said the telecommunications industry has a bright future ahead so long as it is permitted to operate without domination by monopolies. ``As far as my court is concerned, that is precisely what our course will be _ steady as she goes.'' By Mitch Betts, CW staff <<<>>> Title : Big overhaul set at Prime Author : CW Staff Source : CW Comm FileName: 1preorg Date : Dec 19, 1988 Text: NATICK, Mass. _ Prime Computer, Inc. will kick off the new year with a sweeping sales reorganization that could help deter hostile acquisition attempts such as the one currently being carried out by MAI Basic Four, Inc., sources within Prime told Computerworld last week. Under the new game plan, Prime will fold its Prime and Computervision divisions into one fully integrated company split into three geographically organized sales units. Michael Forster, currently heading up general marketing at Prime, will head the U.S. sales unit; the Europe, Africa and Middle East unit and the Far East unit will be headed on an interim basis by Prime Chief Executive Officer Anthony Craig. 
According to a highly placed Prime source, the plan does not contemplate dropping or phasing out any product line. Layoffs, however _ possibly a great number, and probably concentrated at the middle level of the corporate reporting structure _ are not being ruled out. ``This looks like a very major move,'' said Charles Foundyller, president of Cambridge, Mass.-based market research firm Daratech, Inc. He added that Prime users stand to benefit from the reorganization in that a leaner, meaner Prime could mean ``a lot fewer meetings and a lot more action.'' He said the reorganization, while creating a more efficient corporate structure, will also reshape Computervision into an entity less capable of being quickly sold off in the wake of a takeover. In the weeks since Tustin, Calif.-based MAI made its hostile bid for Prime, industry speculation has been rife that MAI Chairman Bennett LeBow, well-known as a so-called ``asset player,'' intended just such a quick sale of Computervision, a major force in the computer-aided design and manufacturing market. Recent unequivocal denials of such intent by LeBow and partner William Weksel have failed to end the speculation. LeBow et al. are wooing Prime shareholders with promises of a markedly more efficient and productive Prime under MAI ownership. Prime's own reorganization, Foundyller noted, augurs the same. ``The only thing LeBow knows how to do is go in there with a cleaver,'' he said. ``If Craig can wield a cleaver of his own, stockholders might reason, who needs LeBow?'' Prime 1; MAI 0 In related news, Prime last week won a first-round victory in the legal skirmishes with MAI spawned by the takeover attempt. The U.S. District Court for the District of Massachusetts granted a preliminary injunction that bars MAI from continuing its tender offer until certain disclosures have been made to Prime's stockholders. 
Among the topics on which shareholders are entitled to further detail, according to the court, are MAI's post-merger plans for Prime; certain prior claims of federal securities law violations against LeBow and Weksel; and the convoluted relationships among MAI, a number of limited partnerships affiliated with Drexel, and Drexel itself _ found by the court to be a ``bidder'' for Prime within the meaning of the law. By Nell Margolis, CW staff <<<>>> Title : Chips coming in Author : CW Staff Source : CW Comm FileName: newchip Date : Dec 19, 1988 Text: SAN FRANCISCO _ Swifter, stronger _ and smaller. This is the outlook for the chips of the 1990s, according to recent breakthroughs in semiconductor technology that were announced last week by commercial vendors at the International Electron Devices meeting here. While Texas Instruments, Inc.'s quantum-effect transistor has the most revolutionary potential, commercial users are likely to see tangible benefits much sooner from Fujitsu Ltd.'s 64M-bit dynamic random-access memory chip, which is under development. Fujitsu said it hopes to create a prototype of the world's most dense memory chip within five years; then it will begin sample production. The 64M-bit technology will have immediate benefits for commercial users because it can be integrated into computers as currently designed. ``You don't have to redesign systems in order to take advantage of more memory,'' said Andrew Rappaport, president of The Technology Research Group, Inc., a semiconductor research firm based in Boston. ``Many current commercial applications, like solids modeling, are currently constrained by how much data you can put in memory.'' A decade off Practical applications from TI's breakthrough, in contrast, are about 10 years away, according to the company. But the implications of the technology are staggering: quantum devices 100 times smaller and 1,000 times faster than conventional transistors. 
The technology is based on the quantum mechanical, rather than electronic, movement of subatomic particles. The electrons behave like waves rather than particles, reacting to different energy levels on the chip. This eliminates the need for the tiny gates that control electron flow on conventional chips. In another announcement at the conference, IBM said it has developed the world's fastest CMOS chip. Able to switch 30 billion times per second, the CMOS device, IBM said, could power today's IBM Personal System/2 to run at the speed of a current IBM mainframe. By Clinton Wilder, CW staff <<<>>> Title : Developer wins $2.2M in B Author : CW Staff Source : CW Comm FileName: bofa1 Date : Dec 19, 1988 Text: A San Francisco jury told Bankamerica Corp. last week to give software developer Gulab Tinmahan a nice Christmas present _ $2.25 million. The jury found that the bank ``wrongfully exerted control'' over Tinmahan's software and that its actions were ``oppressive.'' Ironically, the one juror with technical experience, a programmer, was the sole member to vote in favor of the bank, according to Judge Frank Shaw. Tinmahan complained in 1984 to San Francisco Superior Court that Bankamerica forced him to sign a licensing agreement in which he received no licensing fees, destroyed his source code _ rendering his software unusable _ attempted to frame him by electronically transferring funds in his name and destroyed his career. The bank had argued that because Tinmahan was in its employ, the software belonged to the bank; it said the code was obsolete anyway because it was written for the IBM precursor to the System/34. Tinmahan worked for the bank from 1976 to 1983 as a consultant _ a well-paid one at that, he said: ``They paid me $3,000 a month just to carry a pager.'' His fancy lifestyle ended abruptly in April 1983, however, when the bank barred him from the premises by ``putting two security guards on me,'' he said. 
Tinmahan said he lost his house, car and ability to pursue his work and felt so threatened that he got a license to carry a concealed weapon. He said that he had bailed out the bank in 1980 when it was having software problems with its remittance accounting procedure system. He let the bank know that he had developed software that could help correct the problems. In court filings, Tinmahan alleged his software allowed the bank to computerize interbank transfer collections and perform other functions it was unable to do before using his package. ``At minimum, we will file a motion for a new trial,'' said the bank's attorney, Frank Sommers. By J.A. Savage, CW staff <<<>>> Title : NSA plays it safe, signs Author : CW Staff Source : CW Comm FileName: smart Date : Dec 19, 1988 Text: The recent spate of well-publicized hacker attacks on computer sites throughout the nation is prompting some organizations, including the U.S. government, to get smart. Last week, the National Security Agency (NSA) said it signed contracts with three companies for a security device that uses smart card technology and was designed to protect government computers containing unclassified information against hacker attacks. The awarding of the contracts comes on the heels of several security breaches on the nationwide Internet computer network in November and December. The network links computers at corporations, universities and defense installations. The Computer Virus Industry Association, a Santa Clara, Calif., trade group, has pegged the damage wrought by last month's worm attack on the Internet network at a staggering $98 million worth of lost machine time and manpower required to eradicate the program. The group has calculated that about 6,200 host computers were hit by the worm. What's in the cards A smart card is a plastic card about the same size as a credit card with a microprocessor embedded beneath its surface. 
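Card-and-host systems of this kind typically authenticate with a challenge-response exchange, so the card can prove it holds a secret without ever sending the secret itself. A simplified sketch; the hash scheme, key handling and function names are illustrative assumptions, not the actual LEAD or AT&T design:

```python
# Illustrative smart-card challenge-response exchange. This is a
# simplified sketch, NOT the NSA's actual LEAD protocol or AT&T's design.
import hashlib
import secrets

SHARED_KEY = b"key-stored-on-card-and-host"  # hypothetical shared secret

def host_challenge() -> bytes:
    """Host sends a fresh random nonce so replies cannot be replayed."""
    return secrets.token_bytes(16)

def card_response(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
    """Card proves it holds the key without revealing it on the wire."""
    return hashlib.sha256(key + challenge).digest()

def host_verify(challenge: bytes, response: bytes, key: bytes = SHARED_KEY) -> bool:
    """Host recomputes the expected response and compares in constant time."""
    expected = hashlib.sha256(key + challenge).digest()
    return secrets.compare_digest(expected, response)

# One round trip: host challenges, card answers, host verifies.
nonce = host_challenge()
assert host_verify(nonce, card_response(nonce))
```

As the article describes, a real deployment would still follow a successful card validation with a password prompt, so both the card and its holder are checked.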
The card can be used to store a wide variety of personal data _ the user's birth date, mother's maiden name and the like _ as well as lengthy logon sequences and security data that can limit the user's access to some databases. The card's database can equal several pages of text. The NSA contracts were awarded to ACS Communications Systems, Inc. in Herndon, Va., Interstate Electronics Corp. and Codercard, Inc. in Anaheim, Calif., and Pailen-Johnson Associates, Inc. in Vienna, Va. The three companies will build low-cost encryption/authentication devices, or LEADs, which will be marketed to U.S. government departments, agencies and certain contractors for about $100 each, according to the NSA. LEADs, which can be operated only with a smart card, will be used to screen out unauthorized computer users and encrypt data transmitted on the Defense Data Network, an NSA spokesperson said. In a trial now under way, AT&T is testing its own smart card technology to help protect its computer systems against unwanted intrusion by hackers. To access the computer system, the user inserts the card into a reader attached to his terminal. The host system uses a complex code to validate the smart card and then asks for a password. By Michael Alexander, CW staff <<<>>> Title : Wang rolls out PCs, 800 l Author : CW Staff Source : CW Comm FileName: wangpc Date : Dec 19, 1988 Text: LOWELL, Mass. _ Wang Laboratories, Inc. last week inaugurated a campaign to capture a noticeable share of the PC market with the establishment of a direct-response Microsystems Marketing Group and the addition of two models at the high end of its Professional Computer 200 and 300 series. Slated for shipment this month, the Intel Corp. 80386-based Professional Computer 381 and 382 are both compatible with the IBM Personal Computer AT. 
Aimed at the general-business market, the 16-MHz 381 will be priced from $3,195 _ a 12% savings over Wang's current Professional Computer 380 for a typical configuration, according to a spokeswoman. The 20-MHz Professional Computer 382, priced from $3,450, is targeted at graphics and communications-intensive applications. Marketing maneuvers Calling the new entries ``good, solid products and ones that Wang really needs,'' marketing programs manager Valerie O'Connell said that what is really strategic is the formation of the new marketing division. With the overwhelming number of current personal computer sales going to Wang VS minicomputer installations, the new division ``was chartered to truly focus on getting Wang some presence in the PC market at large,'' O'Connell said. The new group's maiden products, the PC 381 and 382, both support a variety of operating systems, including The Santa Cruz Operation's Xenix System V, Release 2.3 and Microsoft Corp.'s MS-DOS 3.3 and MS OS/2. According to O'Connell, each machine can either run stand-alone applications or function as a node on either an 802.3 or 802.5 PC local-area network. Suitably configured with the right software, either system can also serve as a Wang VS minicomputer workstation, according to the company. By Nell Margolis, CW staff <<<>>> Title : NAS freezes Unix pact wit Author : CW Staff Source : CW Comm FileName: nas1 Date : Dec 19, 1988 Text: ATLANTA _ The feud in the Unix market has claimed one casualty. National Advanced Systems (NAS) has put on hold its plans to develop a mainframe version of the Unix-based Sun Microsystems, Inc. operating system. NAS recently sent pink slips to its 30 employees in the System Software Development Center here. That group was to port the Sun operating system _ based on AT&T's Unix System V, Release 3 _ by late 1989. NAS and Sun entered into a joint development agreement in July 1987, and, technically, the agreement remains alive. 
A NAS spokesman declined to comment on any penalties NAS may incur in modifying its relationship with Sun. A NAS spokesman said the company is concentrating its development efforts on AIX, IBM's version of Unix. IBM compatibility is considered crucial to NAS' survival. The company maintains that closing down its Atlanta operation has nothing to do with persistent rumors that National Semiconductor Corp. intends to sell off NAS. IBM's AIX has been selected as the kernel for the Open Software Foundation's (OSF) version of Unix. The OSF, led by IBM, Hewlett-Packard Co. and Digital Equipment Corp., was formed earlier this year to counter the joint development efforts of Sun and AT&T _ which owns Unix. OSF officials have charged that Sun _ the market share leader in the technical workstation sector _ would receive an unfair competitive advantage as a partner in Unix development. Working away AT&T and Sun are working feverishly to develop Unix System V, Release 4, which is expected to be available by mid-1989. The OSF, which is in the process of selecting a user interface, is expected to release its version of Unix by 1990. NAS will support any standard established by the OSF, the spokesman said. A Sun spokesman expressed disappointment at NAS' decision. ``We're pretty far along in the development work; we'd like to find another partner,'' the spokesman said. Ironically, Sun and AT&T have recently staged a number of public events to demonstrate support for Unix System V, Release 4 among users and third-party Unix developers. The spokesman downplayed the impact of NAS' reversal. ``All it does is slow down the emergence of a mainframe version of the Sun operating system,'' he maintained. ``It will have no other major impact.'' NAS is now forced to play catch-up with its chief rivals Amdahl Corp. and IBM. William Bonin, vice-president of North America operations for the Unix industry's X/Open Consortium Ltd., said the decision is a big win for Amdahl. 
``Amdahl has based a big chunk of its strategy on Unix,'' Bonin said. ``Now, its only competitor is IBM.'' Bonin said he does not expect the loss of NAS to hamper Sun and AT&T. ``They're not a player in Unix,'' he explained. ``If Unisys dropped out, that would be a big deal.'' Amdahl offers its own version of Unix called UTS, based on the Sun operating system. This places NAS at a greater disadvantage to rivals Amdahl and IBM. By Julie Pitta and J.A. Savage, CW staff <<<>>> Title : New shorts Author : CW Staff Source : CW Comm FileName: short121 Date : Dec 19, 1988 Text: DEC RISC shop scuttled Digital Equipment Corp. earlier this month canceled plans to purchase a parcel of Snohomish County, Wash., farmland to serve as a site for development of its own reduced instruction set computing (RISC) technology. The site was to provide a research and manufacturing facility for its Decwest Engineering Group. Citing a change in strategy, Decwest Engineering Group Manager Roger Heinen issued a statement reiterating the company's commitment to the state of Washington. Analysts said DEC's home-grown RISC development has been curtailed. On the other hand, the first fruits of DEC's recently announced alliance with RISC-master MIPS Computer Systems, Inc. are expected in early 1989. IBM, M&D ink marketing pact McCormack & Dodge Corp., based in Natick, Mass., has signed an agreement with IBM to jointly market M&D financial and human resource applications on all IBM processors. The pact increases the ties between the two companies and allows M&D to receive IBM marketing support and attend IBM user conferences. The agreement is the successor to the IBM Industry Marketing Assistance Program pacts and allows IBM to designate M&D as an Authorized Industry Application Specialist . Developers put Openview to work Hewlett-Packard Co. said several major vendors are developing network management applications using the HP Openview user interface. They include Fibercom, Inc. 
in Roanoke, Va., Ungermann-Bass, Inc. in Santa Clara, Calif., Jutland Telephone Co. in Denmark, Microtronix Systems Ltd. in Ontario, Canada, and Telindus in Brussels. Openview is based on Microsoft Corp.'s Windows and provides network managers with a standard user interface. Australian hackers face 10 years Convicted computer hackers may face up to 10 years in jail under federal legislation proposals from an Australian government committee. The committee has rejected advice from the office of the Director of Public Prosecutions and recommended special federal legislation to deal with intrusive hacking and other computer crimes. The Public Prosecutions office had taken a stand that legislation to deal with computer crime was not warranted and too difficult to write. Telegraphing EDI Western Union Corp., based in Upper Saddle River, N.J., last week introduced its electronic data interchange (EDI) service, tightly integrated with its Easylink electronic mail and packet-switching network. Western Union EDI supports ANSI X.12 and other common EDI standards, while EDI-related messages to trading partners can be sent by E-mail, the vendor said at the TDCC _ Electronic Data Interchange Association's annual conference. Big money backs Telesoft retool Telesoft Co., a major Ada software vendor in San Diego, said it received $8.5 million from investors for a research and development program to enhance its Ada compilers and develop computer-aided software engineering tools. The investors are Paine Webber Development Corp. and Swedish Telecom. A Paine Webber official said the investment will help Telesoft expand its share of the Ada market and move into commercial MIS and on-line transaction processing applications. DEC to provide worldwide EDI DEC announced last week its intent to provide EDI software and systems-integration services on a worldwide basis at the TDCC _ Electronic Data Interchange Association's annual conference in Washington, D.C. 
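EDI standards such as X.12 frame each business document _ a purchase order, an invoice _ as a stream of delimited segments wrapped in control envelopes. The following sketch, a simplified illustration in Python and not any vendor's actual implementation, shows how a toy purchase order might be framed; segment contents are abbreviated (a real interchange adds GS/GE group and ISA/IEA interchange envelopes, and the ISA header alone carries 16 fixed-width elements), and all identifiers here are hypothetical:

```python
# Illustrative sketch of X.12-style EDI framing: business data is
# carried in delimited "segments," wrapped in a transaction-set
# (ST/SE) envelope. Segment layouts are simplified and identifiers
# are hypothetical.

ELEM = "*"   # element separator
TERM = "~"   # segment terminator

def wrap_transaction(doc_type, control_num, body_segments):
    """Wrap body segments in an ST/SE transaction-set envelope.

    SE carries the number of included segments (ST and SE counted in),
    letting the receiver detect a truncated transmission.
    """
    st = ["ST", doc_type, control_num]
    count = len(body_segments) + 2          # body + ST + SE
    se = ["SE", str(count), control_num]
    segments = [st] + body_segments + [se]
    return TERM.join(ELEM.join(seg) for seg in segments) + TERM

# A toy purchase order: one line item for 40,000 units.
po_body = [
    ["BEG", "00", "SA", "PO-1001", "881219"],   # beginning of PO
    ["PO1", "1", "40000", "EA", "1.25"],        # line item
]

doc = wrap_transaction("850", "0001", po_body)  # 850 = purchase order
print(doc)
```

The segment count echoed back in SE is one of the simple integrity checks that lets trading partners catch a garbled transmission before an order is acted on.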
DEC, which introduced VAX/EDI software in the UK in July, will support the X.12, Edifact and ISO X.400 standards, officials said. ET <<<>>> Title : Rolm deal: New PBX strate Author : CW Staff Source : CW Comm FileName: rolm4 Date : Dec 19, 1988 Text: IBM's intended sale of its Rolm Systems Division manufacturing and development operations to Siemens AG is a signal that private branch exchange (PBX) technology will not serve as the once-anticipated linchpin in voice and data integration. That was the consensus opinion of industry watchers about last week's announcement, in which the two companies also agreed to create a marketing and services company called Rolm Co. This company will sell telecommunications products for private networks in the U.S. Meanwhile, Rolm's U.S. customers wondered what the announcement means to them. With $16 million recently invested in an IBM-Rolm telecommunications system, Neil Sachnoff, director of information services support at Columbia University in New York, said the university is ``cautiously optimistic'' about the effects of the agreement. But he expressed curiosity about how the marketing division of Rolm Co. will handle being the vendor representative of Rolm, Siemens, NEC Corp. and GTE Communications Systems equipment. Rolm user Hank Backofen, manager of corporate telecommunications at Prime Computer, Inc., said the agreement will not have much impact on Prime, which is used to dealing with separate sales, maintenance and sales support departments at IBM. According to Fred Chanowski, president of Telecommunications Management Corp., the linchpin role in voice and data integration at IBM has now become the province of higher level networking. He said it is implemented in T1, transmission services, wire and fiber-optic media. 
The agreement _ which includes the establishment of IBM's Rolm Systems Division as a Siemens development and manufacturing subsidiary and IBM's marketing of a Siemens PBX in Europe _ will allow IBM to sell a PBX in its systems integration business without having to brook the characteristically low return on investment encountered in PBX manufacturing. Further, IBM has gained access through Siemens to what it lacked most: the highest end of communications technology, said Harvey L. Poppel, a partner at Broadview Associates, a Fort Lee, N.J., investment banking firm. Frank Dzubeck, president of Communications Network Architects, Inc. in Washington, D.C., said that the agreement prepares IBM to take advantage of the 1992 pan-Europe agreement that will create a boom in the communications marketplace. ``IBM didn't have a European-based PBX to greet the anticipated growth in the market,'' he said. ``IBM will be able to sell Siemens European products and merchandise to large enterprises based on specific opportunities.'' In turn, Siemens garners credibility in the U.S. from its association with IBM, as well as the time to integrate operations and penetrate the Rolm customer base, Chanowski said. IBM will also phase out its 8750 PBX systems slated for the European market. By Robert Moran, CW staff <<<>>> Title : Novell releases a hodgepo Author : CW Staff Source : CW Comm FileName: novell2 Date : Dec 19, 1988 Text: NEW YORK _ Novell, Inc. last week unwrapped a hodgepodge of announcements that analysts said did as much to confuse the company's 1989 plans as to clarify them. Beyond positioning statements, much of the session was devoted to shadowy descriptions of the company's future product directions. Novell's 1989 file server strategy includes an obtusely named ``value-added services platform for applications development'' and features the following: Netware 386, an Intel Corp. 80386-based server that reportedly will provide at least three times the performance of the company's Intel 80286-based server; server products under development for two standard client-server protocols, Sun Microsystems, Inc.'s Network File System and IBM's Server Message Block, a move that reportedly will allow users of OS/2 Extended Edition and Unix-based workstations to run on Netware networks in their native environments; support for LAN Manager clients via IBM's Netbeui/Data Link Control protocol; and support for host-based servers that would run on top of a general-purpose operating system, such as Netware for Digital Equipment Corp.'s VMS, which shipped in September. Netware for OS/2 is scheduled to ship in 1989. Mark Caulkins, vice-president of Novell's software group, conceded that IBM's MVS and VM systems are logical candidates for support. Novell also promised to support and encourage distributed application development using such vehicles as Microsoft Corp.'s Named Pipes application programming interface and IBM's Advanced Program-to-Program Communications. First-quarter availability is planned. By Patricia Keefe, CW staff <<<>>> Title : EDI eases toy order rush Author : Elisabeth Horwit Source : CW Comm FileName: matside Date : Dec 19, 1988 Text: This holiday season is also the first in which Mattel has used electronic data interchange (EDI) to exchange documents with key retail chains. The company currently exchanges purchase orders and invoices electronically with most of its major distributors, including Wal-mart Stores, Inc., K Mart Corp., Sears, Roebuck and Co., Toys R Us, Inc., Child World, Inc. and Service Merchandise Co. Mattel began setting up EDI links with its retailers in 1987 and is now trying to bring a new document on-line with one of its trading partners each month, according to the company's EDI coordinator, Roy Fazalare. 
While ``you can figure out paper costs for a rough handle on savings,'' the real paybacks of EDI are intangibles such as ``dollar float and service to your customer,'' he added. EDI can form the basis for ``quick response, the retail industry's equivalent of just in time,'' Fazalare said. The company is currently looking at EDI links as a way to provide point-of-sale information, according to Jeff Harris, director of information technology at Mattel. Right now, the company gets sales data from selected stores every two weeks. With EDI, ``we would know about today's sales tomorrow, and our forecasts would keep improving,'' Harris said. GE Information Services, a division of General Electric Co., acts as an electronic post office where the documents can be picked up. EDI also offers major benefits to the retailers, said Joe Franzino, an assistant to Service Merchandise's EDI administrator. The links have cut the time it takes to process a purchase order or change from five to 10 days down to two days, he said. ``So, we have faster turnover of sales items, and better customer service.'' ELISABETH HORWITT <<<>>> Title : The new on-line world of Author : CW Staff Source : CW Comm FileName: xmas2 Date : Dec 19, 1988 Text: BRISBANE, Calif. _ Santa's workshop has computers. In a shimmery white building on the shores of the San Francisco Bay, a group of 300-plus would-be elves spend months busily designing stuffed animals that will find a home with some lucky boy or girl at Christmas. The number of Christmas wishes has become so great that the elves have abandoned the pencil and pad, on which they once kept their lists, in favor of computers. The workshop has three mainframes _ two Unisys Corp. V380s store production and accounting information, and a V310 is used for software development purposes. Connected to those mainframes are 150 terminals. Bill the Cat vs. Garfield Santa's helpers, just like everyone else, are interested in efficiency. 
And since the computers came in, Santa rarely finds himself with too many Bill the Cats when what the kids really want is Garfield. The white building is the headquarters of Dakin, Inc., a manufacturer of stuffed animals and other gift products. If you could ignore the warm sunshine, it would be easy to mistake Dakin's headquarters for Santa's workshop. The glass-and-stucco building has a magical quality as it sparkles in the sunlight. ``It's been called the crystal palace,'' according to Pete Bardea, Dakin's director of information systems. Dakin's lobby opens into a showroom of its products that range from decorator items such as frames and vases to children's clothing. But most of the exhibition area is devoted to Dakin's core business _ stuffed animals. Bardea's information systems group keeps track of the diverse menagerie. Its mainframes maintain information on the company's 70,000 accounts. Bardea estimates that during the year, MIS will have processed about 150,000 orders. It also keeps track of 2,000 types of products in Dakin's inventory. Computers monitor the travels of Dakin's pets as they shuttle from production plants in the Far East to resting spots at warehouses in California, Illinois and New York and eventually to international customers. This season's challenge is keeping the stores stocked with enough Garfield's Stuck on Yous, a stuffed animal version of the comic-strip character Garfield with suction cups on its paws. The product has been phenomenally popular, taking even Dakin by surprise. ``It's impossible to predict what'll be hot,'' Bardea says. At the time of introduction, Dakin had 12,000 Stuck on Yous ready. Initial orders were placed for 40,000. <<<>>> Title : Mattell net chases Xmas b Author : CW Staff Source : CW Comm FileName: mattell Date : Dec 19, 1988 Text: HAWTHORNE, Calif. _ Letters to Santa Claus notwithstanding, Mattel, Inc. 
has always had to struggle to get its toys under the right Christmas tree each year _ without having a lot of unsold Barbies and Hot Wheels sitting in inventory on Christmas day. This holiday season has been different, however. Just in time for the Christmas rush, the toy company installed the last data link of an international networking system that is expected to substantially reduce inventory stock levels and production-schedule lead time (see chart) and save Mattel approximately $540,000 per year in worldwide communications costs. Before the network came on-line, communications between Mattel's data center here and plants and distribution warehouses around the world were haphazard at best. Too stiff As a result, the toy maker lacked the flexibility to shift production schedules to meet updated sales forecasts on an ongoing basis. Mattel was also unable to match uncommitted stock in one warehouse with unfilled orders in another because it had no centralized database to keep track of inventory levels around the world. ``It took so long to find out what was available, you almost didn't ask,'' said Jeff Harris, Mattel's director of information technology. ``If you ordered it, and it was available, they shipped it.'' This situation became critical during the period between late September and mid-December, when the toy industry does approximately 60% of its business, Harris said. Mattel would start building up its inventory in the summer on the basis of long-range sales projections. But with no way to effectively track inventory movement across all of its warehouses, the company could not accurately update its forecasts once the shopping season started. Nor could it send out modified production schedules to its plants in a timely fashion. This lack of flexibility contributed to the financial bath that Mattel took when sales for Masters of the Universe-related merchandise plunged from $300 million in 1983 to $40 million in 1984. 
Forecasting its 1984 sales on the basis of 1983 performance and unable to react quickly enough to changing market demand, the company ``wound up sitting around with an awful lot of Masters of the Universe,'' Harris said. In November 1987, Mattel's information systems department got approval to start building a companywide network to address these problems as well as an out-of-control communications budget. The network combined private terrestrial lines and very small-aperture terminal satellite dishes to support voice, data, telex and facsimile communications. Last October, a link to West Germany completed the data side of the network, which connects IBM Application System/400s and System/38s installed at Mattel headquarters, at worldwide distribution centers and at manufacturing plants in the Far East. Thus, a warehouse in West Germany may be able to fill an unexpected order for Barbies by requisitioning uncommitted stock in France or finished goods awaiting shipment in Hong Kong. Mattel now knows what finished goods are due from which plant on a daily basis and ``where inventory is in terms of our ability to move it,'' Harris said. Direct links between headquarters and overseas plants provide Mattel with ``better alignment between the production schedule, market forecast and real orders,'' Harris said. Also, engineers at Hawthorne can exchange drawings and changes with overseas plants overnight via electronic links. This shortens the time it takes for a concept to become a product and allows Mattel to bring out new toys in time to catch passing fads, Harris said. One step beyond Last May, Mattel Chief Executive Officer John Amerman praised the network as ``a joy to behold, a quantum leap forward,'' adding that Mattel could ``now manage inventory rather than react to situations as they occur.'' Mattel says it expects inventory turnaround cycles to drop from 30 days to one day as a result of the network. 
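The warehouse matching described above _ pairing unfilled orders in one region with uncommitted stock in another _ amounts to a simple allocation pass over a centralized inventory table. A minimal sketch in Python, assuming illustrative names and figures (this is not Mattel's system):

```python
# Hypothetical sketch: fill unfilled orders from uncommitted stock
# held in other warehouses, as a centralized inventory database
# makes possible. All warehouse names, SKUs and quantities are
# illustrative.

def allocate(orders, stock):
    """Greedily fill each order from whichever warehouses hold
    uncommitted units of the SKU, in the order stock is listed.

    orders: list of (destination_warehouse, sku, qty_needed)
    stock:  dict {(warehouse, sku): uncommitted_qty}, drawn down in place
    Returns a list of (from_wh, to_wh, sku, qty) transfer orders.
    """
    transfers = []
    for to_wh, sku, need in orders:
        for (from_wh, s), avail in stock.items():
            if s != sku or avail == 0 or need == 0:
                continue
            take = min(avail, need)
            stock[(from_wh, s)] -= take    # commit the stock
            need -= take
            transfers.append((from_wh, to_wh, sku, take))
    return transfers

stock = {("France", "BARBIE"): 300, ("HongKong", "BARBIE"): 500}
orders = [("WGermany", "BARBIE", 400)]
# Fills 300 from France, then 100 from Hong Kong
# (dict insertion order, Python 3.7+).
print(allocate(orders, stock))
```

A real system would add prioritization _ shipping cost, transit time, order deadlines _ but the payoff described in the article comes from the visibility itself: knowing, in one place, what is uncommitted where.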
Another of the network's major goals is to allow the firm to control communications costs by providing the same level of network service to all its sites and work groups at a set monthly cost, Harris said. Under the old system, ``costs were very difficult to control and utilization hard to manage,'' Harris explained, because private lines were installed on an as-needed, project-by-project basis. The new network makes 9.6K bit/sec. leased lines available throughout Mattel at set monthly rates. The complete system should be up and running by March. It is expected to reduce Mattel's 1988 communications expenses by $810,000. Inventories for this Christmas season ``are already down substantially,'' Harris said. While Harris refused to give the network full credit for these developments, the system has clearly helped make this Christmas a little merrier for Mattel. By Elisabeth Horwitt, CW staff <<<>>> Title : Top telecom jobs changed Author : CW Staff Source : CW Comm FileName: divestit Date : Dec 19, 1988 Text: The divestiture of AT&T has made communications managers' lives ``roughly analogous to parents of schizophrenic children,'' the leader of a business telecommunications group says. ``The up side is that some very creative things can be done; the downside of that is that [their jobs] have become extremely complex,'' said Kenneth Phillips, chairman of the Committee of Corporate Telecom Users and a vice-president of telecommunications policy at Citicorp. During the last five years, communications managers have had to become savvy, flexible and aggressive to chart a successful course in volatile postdivestiture waters. Among the influences that companies must now take into account when hammering out network strategy are ``the dynamics and subtleties'' of tax policy, depreciation and capital recovery as well as volatile regulatory and political environments, Phillips said. 
Computerworld asked several communications managers how divestiture has affected their jobs and their companies' communications strategies. The issues and concerns they talked about fall into three broad areas: relationships with carriers; the regulatory environment; and the communications marketplace. An increasing number of companies find their communications strategies have come full circle during the past five years. Major firms that sought independence from _ and bypass of _ carriers a few years ago are now looking for vendors to manage their networks for them. Taking the reins Right after the breakup, ``the end user had to grab more control'' of the network, ``because he no longer had mother to solve his problems for him,'' said Frank Dzubeck, president of Washington, D.C.-based consulting firm Communications Network Architects, Inc. But now that networks are becoming increasingly complex and are combining many vendors' services and equipment, ``You have to do your homework and know where everything is, be able to manage it and do your own fault isolation,'' said Dennis Murphy, director of telecommunications at Warner Communications, Inc. Warner has avoided this situation so far by sticking with a basic network but is now considering a more sophisticated solution that may well push it into the arms of ``someone like AT&T'' or a systems integrator that can take over the difficult job of managing the network, Murphy said. AT&T is attempting to lure corporations back into its lap through special deals such as Tariffs 12 and 15. Under Tariff 15, AT&T can provide price breaks to major customers; under Tariff 12, it can combine services, equipment and network management into tailored packages. Companies such as American Airlines, General Electric Co. and Ford Motor Co. have bought such packages. 
But some communications managers are nervous about handing so much power back to AT&T, particularly because the Federal Communications Commission is still evaluating Tariffs 12 and 15. AT&T might tell top management, ``We'll save you $36 million a year off the top with no loss in service,'' said W. Edward Hodgson, a manager of computers and communications at Westinghouse Electric Corp., which is currently negotiating such a deal with the carrier. ``They take your network away from you, and then the FCC decides they have to charge what everyone else does, and you're stuck.'' Several communications managers echoed Hodgson's view that the FCC is not taking a strong enough line with AT&T and the divested Bell operating companies. ``The mood of the country is toward further deregulation. But when the right thing to do is known, it's sometimes better to have a regulated environment,'' said John Saccente, director of telecommunications at Tenneco, Inc. Going your own way Saccente and other users said they wanted an authority such as the FCC to ensure uniform availability and pricing of services offered by the Bell operating companies. Right now, ``each BOC is going its own way,'' Saccente said. Managers praised AT&T divestiture-related regulations that have opened up the long-distance market to competition, giving them a greater choice of providers and services and a stronger bargaining position. 
When it came to further deregulation of the Bell operating companies, however, Westinghouse's Hodgson expressed the general view, ``I would like to see them offer more [communications] services, such as voice mail, but I don't want them to spend my money to buy new businesses that have nothing to do with my communications needs.'' By Elisabeth Horwitt, CW staff <<<>>> Title : Battery charge Author : Nell Margolis Source : CW Comm FileName: 1219stoc Date : Dec 19, 1988 Text: Technology stocks, more battered than bettered of late, were showing signs of resurgence last week _ is this the holiday spirit, or what? Novell, Inc. dropped a point to 29 after warning that a hardware inventory writedown caused by a strategic shift toward software would depress fourth-quarter earnings. By Thursday, however, it had picked up a fraction of a point to close just above 30. Glad tidings rang out from the semiconductor sector. Micron Technology, Inc. reported first-quarter sales up more than 100% over last year but down from fourth-quarter revenue because the company could not turn out dynamic random-access memory chips fast enough to fill the ever-escalating demand. Its stock closed up a fraction of a point Thursday, finishing just above 15. Intel Corp. picked up a fraction of a point to close above 22; LSI Logic Corp. ended the week just above 9, up a fraction; and VLSI Technology, Inc. gained a fraction of a point, ending the week above 8. Elsewhere in the industry, IBM closed Thursday at 121, up a fraction of a point. Digital Equipment Corp. fell more than a point to a Thursday close near 92. NELL MARGOLIS <<<>>> Title : AS/400 hits on all cylind Author : CW Staff Source : CW Comm FileName: asrev Date : Dec 19, 1988 Text: IBM recently sailed past the 25,000-unit mark for AS/400 shipments, a tremendous showing for the first three months of a product's life. And it looks like IBM is just warming up: ADM, Inc., a research firm, recently predicted that 7,000 more units may ship before the end of the year. 
In posting the early victory, IBM provided a textbook business-school lesson in sales acumen. The rapid deployment of Application System/400s was not only the result of providing a salivating base of System/36 and 38 customers with a long-desired power and memory upgrade but also was fueled by the AS/400's ability to steal customers away from other vendors. Additionally, the announcement provided a launchpad to position the AS/400 as a dominant force in the overall mid-range market. ``There's a tendency to think of [the AS/400] as a bigger or better System/38, something for just those folks,'' said David Andrews, president of Cheshire, Conn.-based ADM. ``But a lot of capabilities coming down the line _ artificial intelligence, optical storage, voice integration _ will be put on this platform.'' IBM has gotten off to a good start in achieving that goal, putting together a well-oiled machine before pushing the Start button. It carefully crafted the product introduction, making it a big enough news event to warrant network news coverage. It also kept the industry's awareness high with an advertising blitz. The results so far have been impressive and should remain healthy: Cambridge, Mass.-based Forrester Research, Inc. has said AS/400 shipments in 1989 should reach 38,000. IBM has also worked closely with users in designing the product, lining up hundreds of software developers before the announcement instead of presenting the industry with a near-softwareless system _ as the firm did with the 9370. Michael Stombaugh, DP manager at Remington Freight Lines, Inc. in Remington, Ind., claimed this as a primary reason for abandoning a Digital Equipment Corp. VAX-11/750 and purchasing an AS/400 Model B30. ``DEC doesn't court our industry,'' he said, adding that IBM provided the dispatching-industry software that DEC lacked. 
Conquest sales IBM claims to have already captured several thousand users _ one-third of the 25,000 shipped _ from outside the System/36 and 38 community. But John Logan, executive vice-president at the Boston-based Aberdeen Group research firm, cautioned that the 8,000 sales do not strictly represent IBM's marketing power. Logan said more than half those sales came from users who were ready and waiting for a new IBM mid-range machine anyway. Michael Holycross, DP manager at Acusport Corp. in Bellefontaine, Ohio, managed an HP 3000/48 but converted to an AS/400 Model B50 last month. ``HP had their newer machines coming out, but we were skeptical,'' Holycross said. ``We're looking into the future and felt that IBM had more to offer than HP.'' Building momentum among non-System/36 or 38 users is critical for IBM, because by mid-1989 these ready-and-waiting users should be satisfied and the real sales effort must begin in earnest, Logan said. By Rosemary Hamilton and James Daly, CW staff <<<>>> Title : DEC to mix systems Author : CW Staff Source : CW Comm FileName: rdbvms Date : Dec 19, 1988 Text: For once, IBM got there first, and DEC is moving to offer the same. Digital Equipment Corp. is working to integrate its RDB relational database with its VMS operating system, according to Victoria Farrell, DEC's database marketing manager. In tying its proprietary relational database management system to its proprietary operating system, DEC is following IBM's lead, imitating IBM's Application System/400, which includes a built-in relational database management system with its operating system. The strategy is also parallel to that of IBM in linking its DB2 DBMS with its Enterprise Systems Architecture. The move could change the rules of the game in the VAX software market. Following footsteps ``DEC could do with RDB what IBM did with DB2,'' said John Birch, director of mid-range development at McCormack & Dodge Corp., an application software firm in Natick, Mass. 
``It allows you to make changes to other products dependent on RDB,'' Birch said. ``It allows you to take the functions in software and put them in hardware.'' Meanwhile, the AS/400, which is experiencing strong early sales (see story at left), is emerging as a competitive threat. ``The AS/400 has made built-in relational database a requirement in the mid-range,'' said David Andrews, president of ADM, Inc., an AS/400 consulting firm in Cheshire, Conn. ``As long as RDB remains a layered product outside of the operating system, it will never be as efficient'' as an integrated DBMS, Birch said. Already, DEC has reportedly begun bundling a runtime license of RDB with VMS in France, although DEC would not confirm the reports. The runtime RDB does not allow for development work, only program execution, said Terry Shannon, a DEC analyst at International Data Corp. in Framingham, Mass. ``They could bundle runtime RDB any time. That's just a marketing decision.'' Fully integrating RDB with VMS could be the next step, and a hardware implementation of RDB could follow, Farrell said. Independent software vendors such as Relational Technology, Inc. are anticipating DEC's moves, concentrating on offering tools to work with RDB. Robert Healy, vice-president at Relational, whose Ingres relational database is widely used on VAXs, offered, ``Hardware vendors think that by owning the data and communications, they will have account control. DEC sees IBM doing that and would like to emulate it.'' Same boat Meanwhile, Cullinet Software, Inc. could face the same problem with DEC and RDB as it faced with IBM and DB2. When IBM introduced DB2, it ate heavily into Cullinet database sales. However, Cullinet, like Relational, is stressing a tools strategy featuring its Enterprise line of development software. 
A Cullinet user _ Ivan Rodriguez, who is assistant director of systems and telecommunications services for Metropolitan Dade County in Florida _ said that although he does not now believe that RDB is superior to Cullinet's Enterprise:DB, if it were improved, he might consider using it. However, he said he will continue to use his Cullinet Enterprise tools. Another DEC user said he would be interested in a combined RDB and VMS, but only if it offered improved performance over other products. ``If it were bundled with the operating system, then we would certainly experiment with it,'' said Timothy Kahn, MIS manager at Sonoco Fibre Drum, Inc. in Marietta, Ga. Given that a future hardware implementation of RDB would offer considerably improved performance over RMS, he said, ``we would definitely look at it.'' By Stanley Gibson, CW staff <<<>>> Title : Networking at DEC VAX sit Author : James Daly Source : CW Comm FileName: trendola Date : Dec 19, 1988 Text: Need any convincing that local-area networks are selling like hotcakes to the mid-range systems market? Just take a quick hand-count at Digital Equipment Corp. VAX user sites, where LAN use has more than quadrupled in the past three years. According to recent surveys by Computer Intelligence, a La Jolla, Calif.-based market research firm, 60% of VAX sites have a LAN installed, with another 11% planning to install one. In the summer of 1985, only 14% of VAX sites could make that claim. ``I think the driving force behind this has been DEC's commitment to LAN technology,'' said Bruce Coughran, a Computer Intelligence researcher. Coughran added that DEC has made a point of emphasizing the use of Ethernet LAN technology to maximize the benefits of its big iron and that its users have apparently taken the advice to heart. Maynard, Mass.-based DEC has also performed an admirable job of selling DEC LANs to its customers. Sixty-three percent of VAX sites opt for a DEC LAN, with the nearest competitor _ Apple Computer, Inc. 
_ able to muster only a 9% showing. Still, DEC's LAN penetration into the VAX market has slid from what it was two years ago, reflecting the pressure that LAN products such as Apple's Appletalk have put on the market. JAMES DALY <<<>>> Title : Inside lines Author : CW Staff Source : CW Comm FileName: insidewa Date : Dec 19, 1988 Text: Reach out and bite someone. Having recently suffered a disastrous severance of its East Coast fiber-optic cable system, AT&T is making sure the same thing doesn't happen to TAT8, the transatlantic fiber-optic cable that went live last week. The big threat at sea is sharks, which for some unknown reason find fiber-optic cable extraordinarily toothsome. So AT&T had its research and development subsidiary, Bell Laboratories, develop Fishbite-Protection cable, which now protects TAT8 in shark-infested waters. Programmer turned gumshoe. A programmer at Lawrence Livermore National Laboratories recently discovered that a hacker had ransacked files on five computers there; the programmer did a bit of detective work to find the intruder. He reportedly called around and hit on a likely suspect. Problem is, he neglected to alert federal authorities of his findings. A source said that outraged FBI agents spent a couple of days at the esteemed lab, wanting to know why they hadn't been called in on the case. The G-men believe that computer researchers at Livermore may have tried to cut a deal with the hacker to get him to stop the electronic break-ins rather than have him arrested. Ship arrives month early. Although DEC's formal announcement of Decwindows is reportedly slated for Jan. 10, it turns out that DEC is already shipping the product. The much-discussed graphical user interface is offered with Ultrix-32 Version 3.0, which was announced in August and began shipping this month. Greased skids. Amdahl said it would ship its new 6100 storage device by year's end, and it looks like it will meet that deadline. 
The 6100, which the company said is more of a storage processor than a controller because of the additional intelligence built into the system, is designed to compete with the IBM 3880. Another 6100 model, expected out in late 1989, is designed to compete with the newer IBM storage controller, the 3990. Meanwhile . . . Amdahl competitor National Advanced Systems plans to announce by year's end a fourth-quarter 1989 availability date for IBM Enterprise Systems Architecture (ESA) compatibility. Amdahl announced a similar availability date in September. Both plug-compatible manufacturers are re-engineering several processor cards so that their mainframes can accommodate IBM's new MVS/ESA operating system. OS/2 and you. Microsoft finished its first versions of OS/2 and the Presentation Manager and is now busy on the second, which will eliminate the annoying eight-character limitation on file names. Microsoft is also considering adding some object-oriented features to the Presentation Manager and Microsoft's C language, but no timetable has been announced. Also, IBM has begun demonstrating CICS running under OS/2. With this technology, shops can potentially turn OS/2 into a multiuser system aimed at transaction processing. Also on tap is a 32-bit version of OS/2, but our sources indicate that users may have to wait until 1990. We want to be like CA when we grow up. The recently proposed merger of Morino Associates and Duquesne Systems is ``just the beginning,'' according to Mario Morino, chief executive officer of Morino and chairman of the new company. ``Expect to see the new company be very aggressive in growth,'' he told an audience last week at the firm's user conference. Morino acknowledged that he has discussed possible mergers with other firms and had extended an offer to BGS Systems, another IBM mainframe systems software vendor, but was turned down. 
However, rumors persist of a possible relationship with either BGS or privately held Candle Corp. Have a wonderful holiday and be careful you don't spill champagne on the disk drive at your office party. CW's annual Forecast double issue is next week, so we'll take a breather. We will still be manning the holiday hot line, however. Give News Editor Pete Bartolik a call at 800-343-6474 or, if you're in Massachusetts, 508-879-0700. <<<>>> Title : Oh, yeah? Author : Christopher R. H Source : CW Comm FileName: hertlet Date : Dec 19, 1988 Text: In a recent Computerworld article, you quoted Microsoft Corp. Chairman Bill Gates as saying that ``true multitasking won't work in a 1M-byte system'' [CW, Nov. 21]. How does he explain away a 512K-byte multitasking Amiga 2000? Most of the time, I use the extra memory as a random-access memory disk. Christopher R. Hertel Winnetka, Ill. <<<>>> Title : IBM Netbios-compatible so Author : CW Staff Source : CW Comm FileName: netperfo Date : Dec 19, 1988 Text: IBM Netbios-compatible software that provides application-level compatibility with Novell, Inc.'s Netware has been unveiled by Performance Technology Corp. Powerlan 1.2 permits selective sharing of workstation-connected resources, such as drives, printers and plotters, and runs applications under DOS, OS/2 and Xenix operating systems. Registered users of Powerlan 1.1 will receive the upgrade version free. Performance Technology, 800 Lincoln Center, San Antonio, Texas 78230. 512-349-2000. <<<>>> Title : Computer industry outlook Author : Nell Margolis Source : CW Comm FileName: helga2 Date : Jan 9, 1989 Text: Industry observers who have had little to say beyond ``Bah. Humbug!'' about the state of the computer sector of late may be suffering the effects of visits from the Ghost of Computer Industry Past. 
Recent figures compiled and issued by the Computer and Business Equipment Manufacturers Association (CBEMA) show a still-vital industry that, while slowing because of maturity, is nevertheless outstripping the gross national product and is projected to keep doing so. Software and services revenues, already the most robust of four industry segments measured, are expected to continue leading growth through 1989. CBEMA sees the software sector growing 11% to $66.6 billion in 1989, compared with a 15.1% rise in 1987 and an estimated 15.8% increase in 1988. Computer equipment revenues grew 10% last year to reach $118.4 billion and are estimated to grow 11% to $131.4 billion this year. By the end of next year, CBEMA said, an 8% rise to $141.9 billion is expected. Growth in telecommunications equipment revenues slowed from 1987's 4.1% to approximately 3% in 1988; revenues are expected to increase by some 4% in 1989 to $57.5 billion. Telecommunications services growth, already down from 7.6% in 1987 to an estimated 6.9% in 1988, should decrease further to 5.4% in the coming year. That diminished percentage growth, however, will nevertheless represent a $149.1 billion industry niche. NELL MARGOLIS <<<>>> Title : Terry R. Lautenbach Senio Author : CW Staff Source : CW Comm FileName: lauttit Date : Jan 9, 1989 Text: Terry R. Lautenbach Senior Vice-President and General Manager IBM U.S. <<<>>> Title : 'Twas the season of givin Author : CW Staff Source : CW Comm FileName: liner02 Date : Jan 9, 1989 Text: 'Twas the season of giving and all through the yard Not a vendor was stirring, nor even this bard `Up, up, up!' 
the editor did cry `Our readers await; our trade we must ply Rich gossip we solemnly promise, by golly And right now, I promise, I ain't feeling jolly' `Please sir,' one writer did say with a bow `It's Christmas, O master, we want to leave now' `Nay, read what my lips say,' the boss-man evoked `Those glowing reviews can be quickly revoked `On Barney, on Daly, on Cortese and Ryan Get back to your stations! Let's see fingers flyin'! Go Hamilton, Brandel, go Pitta and Keefe Go after that story like hounds with bared teeth! `Yo, Wilder, Horwitt, yo, Savage, Margolis You can do much more, and I really know this Take Connolly, Alexander and, yes, Mr. Gibson To Armonk and Maynard your pens should be blitzin' `Call in Bozman, call Betts and, hey, call in Moran It's high time to show the whole world that we can' Just then, just that moment, as the troops did assemble A senior staff member did stand up and tremble `If we're desperate, the cat should be freed from the bag Let's preview for readers our January rag You know on Jan. 9 we will print words from Akers Let's give a taste now just to show we're not fakers `The IBM chair through an hour's talk did last Examining the ghosts of IBM's future, present, past Explaining the whys of the corporate vivisection John Akers looks down on a Blue resurrection `It takes that cold shower . . .'' John A. said with knowing `to get your blood going . . . and our blood is going' If from IBM's pinnacle you'd like to hear Grab our first reg'lar issue of the upcoming year Another voice cried, `Hey, there, boss _ stop the presses! 
A Microsoft rumble I've heard that perplexes In OS/2 Extended, the source code, it seems Has suddenly in Armonk stirred up some bad dreams `Microsoft needs it for true compatibility But IBM has trouble assessing reliability In the wrong hands, it's said, IBM fears this code Would put upstart challengers 'pon the right road' `We're hot now; we're humming,' the boss said with glee `I think tunnel's end now I'm starting to see What else, now, what else,' he remarked as he tore Through the now-bustling newsroom, out looking for gore Hit by a bolt, our benchmark whiz smiled `Hey, boss, my quest's slip'ry as treasure of the Nile From DEC since July an audited report we've awaited And here still I sit, boss, completely unsatiated `And now IBM with its benchmarks will show that the 4381 and 9370 really do glow They redid the network to rebut TPS saviors And lo and behold _ performance barely wavers' Then, like a happy Santa, the editor did sit back No more inches to fill _ finally! gifts he could wrap `Let's call it a wrap, now, and clear out of here Let's rest and get ready for a happy new year' To his readers he offers a parting thought: We hope you could use what this past year we've wrought We've done our job well, but still, we're not happy We'll try to do more _ and we hope this ain't sappy Our hot line (800-343-6474 or 508-879-0700) you can call and News Editor Pete Bartolik's staff will carry the ball <<<>>> Title : The industry year in list Author : Clinton Wilder Source : CW Comm FileName: cicol2 Date : Jan 9, 1989 Text: Nineteen eighty-eight. Leap year, Olympic year, election year. It's all over now, except for the shouting _ and the list-making. As we do every year, we'll skip the 10 Best Movies and 10 Worst-Dressed Women and Men compendiums and concentrate on what we humbly purport to know best. The 10 Biggest Stories of 1988: 1. The standards wars. The Open Software Foundation vs. 
Unix International _ nee Archer group _ and the Extended Industry Standard Architecture vs. the Micro Channel. 2. The IBM Application System/400 _ nee Silverlake. 3. Computer Associates reaching the $1 billion revenue plateau by acquiring Applied Data Research _ and CA's rebuffed attempt to swallow Management Science America (MSA). 4. The saga of Prime Computer _ from the hunter of Computervision and Calma to the hunted of MAI Basic Four. In a year of unwanted suitors such as CA, Daisy Systems and Telxon, MAI's Prime pursuit proved the most significant. 5. Apple's ``look-and-feel'' suit against Microsoft and Hewlett-Packard. 6. The struggle of mini makers such as Data General and Prime against the accelerating trend of MIS buyers to powerful PCs and local-area networks. 7. The creation of IBM United States under Terry Lautenbach. IBM's flattening of its organization and pushing of decision making out to its business units mirrors the organizational trend being practiced by many of its largest customers. 8. The dynamic random-access memory shortage. 9. The woes of Lotus _ missed shipping dates, executive departures and a falling stock price. 10. The minisupercomputer industry shakeout. 11. (Last-minute addition.) The merger of Morino Associates and Duquesne Systems. Comebacks in 1988: John Cullinane, Cullinet. Gene Amdahl, Andor. H. Ross Perot, Perot Systems. Steve Jobs, Next. Bill McGowan, MCI Communications. Ed Cherney, Encore International. Bill Poduska, Stellar. Allen Michels, Ardent. Ken Oshman, Echelon. Moving up in 1988: Terry Lautenbach, IBM. George Conrades (again), IBM. James Cannavino, IBM. Frank King, Lotus. Bob Weiler and John Landry, Cullinet. Allen Loren, Apple. Anthony Craig, Prime. Curtis Hessler, James Unruh, Unisys. Larry Perlman, CDC. Dennis Vohs, Ross Systems. Robert Allen, Robert Kavner, AT&T. Gil Williamson, NCR. Moving out in 1988: Bill Lowe, IBM. Joe Henson, Prime. Jerome Meyer, Honeywell Bull. Allen Krowe, IBM. Kaspar Cassani, IBM. 
David Chapman and George Tamke, Cullinet. Bill Graves, MSA. The Lotus departures _ Chuck Digate, Irfan Salim, Mike Kolowich, Palmer True, and John Shagoury, among others. Roy Folk, Ashton-Tate. Peter Appleton-Jones, Elxsi. Norbert Berg, CDC. Thomas Roberts, CDC. Jan Lindelow, Unisys. Del Yocam, Apple. Debi Coleman, Apple. Stu Miller, Lynn Pearce, Software AG. Casualties in 1988: Saxpy Computer. Celerity Computing. Cydrome. Scientific Computer Systems. Ramtek. Rise Technology. By Clinton Wilder; Wilder is Computerworld's senior editor, computer industry. <<<>>> Title : Postal Service cries uncl Author : CW Staff Source : CW Comm FileName: postal2 Date : Jan 9, 1989 Text: WASHINGTON, D.C. _ The U.S. Postal Service has given up on its controversial computer services deal with Perot Systems Corp. because of legal protests filed by archrival Electronic Data Systems Corp. (EDS). Postmaster General Anthony M. Frank announced Dec. 16 that the Postal Service and Perot have agreed to terminate the contract, albeit reluctantly. In a statement, Frank said, ``The contract has been tied up so badly by protests and lawsuits that we have lost much of the original opportunity. ``The studies we had hoped to complete last August cannot now be done before next February and could well be further delayed by active litigation,'' namely ongoing lawsuits between Perot and EDS, Frank said. A recent court order [CW, Oct. 24] sought by EDS limits Perot Systems to nonprofit work through Dec. 1, 1989, and effectively killed the deal, the postmaster general indicated. EDS and Perot are involved in a bitter feud dating back to December 1986, when EDS founder H. Ross Perot left EDS and its parent General Motors Corp. EDS argues that Perot is violating a severance agreement stating that he would not compete with EDS for three years. 
Perot Systems, based in Vienna, Va., issued a statement that supported Frank's decision, adding that ``efforts to make major changes in a large organization inevitably energize forces opposed to change.'' Splashy reentry The Postal Service contract was signed in May as part of Perot's splashy reentry into the computer services market. The two-part contract allowed Perot's new company to study methods of improving postal operations and then implement the ideas under a shared-savings contract. But the unusual arrangement triggered howls of protest from competitors and members of Congress because it was not open to competitive bidding and seemed to lock Perot Systems into a 10-year sweetheart deal with the agency. EDS and McLean, Va.-based computer services firm Planning Research Corp. each filed official protests with the General Services Administration's Board of Contract Appeals. Not revenge The EDS protest came after several top EDS executives defected to work for Perot Systems, but EDS denied that its protest was sparked by any revenge motives. Jack Biddle, president of the Computer & Communications Industry Association, praised the decision to scrap the contract. He said the action preserves the government's policy of open competition and avoids the financial conflict of interest that occurs when a company involved in ``problem identification'' also provides the hardware and software solutions. By Mitch Betts, CW staff <<<>>> Title : European unity to alter P Author : CW Staff Source : CW Comm FileName: ec2 Date : Jan 9, 1989 Text: PARIS _ The European Community's plans to abolish internal trade barriers in 1992 are forcing personal computer suppliers to modify their production and distribution strategies. ``Europe is still a very fragmented marketplace,'' said Brigitte Morel, managing director of Dataquest/Intelligent Electronics. 
``But we all know that in the next 10 years, we will have to sell to this market in a very unified fashion.'' In that time, the Western European market will undergo sweeping deregulation of export laws, customs oversight and even currency differences among nations. The changes pose vast challenges for all companies selling goods and services in Europe but may prove particularly vexing for vendors in the fast-growing but intensely competitive personal computer market. Morel and other speakers at a Dataquest/Intelligent Electronics conference here last month stressed that PC manufacturers must develop distribution channels that can reach all of Europe and offer greater added value _ in the form of services, training and support _ to professional users. PC suppliers are also being forced to reexamine their production strategies in light of the European Community plans, lest they find themselves locked out of what will become the world's largest PC market, according to Dataquest/Intelligent Electronics. Many already have begun shifting manufacturing to local plants. `Attic assembly' More than 50% of the PCs shipped in Europe during the past two years were made in Europe, according to Dataquest/Intelligent Electronics researcher Kees Dobbelaar. Most imports came from Southeast Asia, not from the U.S. or Japan. Even these imports are starting to be challenged by what Dobbelaar called ``attic assembly'': small local firms importing components from Asia. IBM accounted for 31.3% of the production volume in Europe in 1988, according to Dataquest/Intelligent Electronics' preliminary figures. It was followed by Ing. C. Olivetti & Co. at 15.6%, which manufactures its PCs near corporate headquarters in Ivrea, Italy; Apple Computer, Inc. at 9.6%, which has a facility in Cork, Ireland; and Hewlett-Packard Co. at 5.5%, which has doubled production this year at its plant in Grenoble, France. 
Compaq Computer Corp., which this year accounted for only 3.7% of European production, recently built a plant in Scotland. Shoring up local identity Other non-European firms are expected to reinforce their European manufacturing presence in order to shore up their local identity and avoid the potential protectionist fallout of the European Community plans. With user demands becoming more sophisticated, it is increasingly difficult for PC dealers to maintain profitability, said Michel Aguerreberry, managing director of Agena SA, a major French distributor. He predicted the European dealer network will grow only 5% annually during the next five years. ``The expected market growth will lead to a bottleneck,'' he said, ``if manufacturers don't change their attitudes toward distribution channels.'' He said this means manufacturers must reduce their reliance on direct sales and form closer partnerships with large distributors and value-added resellers. Paul Helminger, vice-chairman of Computerland Europe S.A., agreed. ``The best partnership strategy is the one which adapts best to the variety of local market situations while at the same time taking advantage of the [overall] European dimension business is moving into,'' he said. By Amiel Kornel, IDG News Service <<<>>> Title : Paradyne woes end with $2 Author : CW Staff Source : CW Comm FileName: para2 Date : Jan 9, 1989 Text: Paradyne Corp.'s long and rocky road back from legal scandal and financial disaster took a sharp turn last week when AT&T agreed to acquire the beleaguered data communications equipment maker for approximately $250 million, or $10.25 per share. Analysts, who expected a bid for Paradyne _ a frequent entry on lists of probable acquisition targets _ but voiced surprise that the bidder turned out to be AT&T, viewed Paradyne's sales and service strength as fueling the offer. 
``Paradyne has a fairly sizable sales force, and their distribution channel and service organization are considered quite good,'' said Theodore Moreau, an analyst at Robert W. Baird & Co. in Milwaukee. Services, said Moreau, accounted for approximately 20% of Paradyne's 1987 revenue of $233 million. Paradyne's ``international distribution, sales and service are of great interest to us,'' an AT&T spokeswoman confirmed. Paradyne's large installed base and Largo, Fla., plant also received mention as attractive to AT&T. ``I think that [AT&T is] desperate to improve its market reach into data networking and realistic about its ability to do it through internal development,'' said Maria Lewis, an analyst at Shearson Lehman Hutton, Inc. Lewis dismissed as public relations spin AT&T's contention that the Paradyne acquisition will also bring AT&T ``some manufacturing and development expertise that they don't already have.'' However, she and others expect the communications giant to employ a balance rather than a hatchet when it comes to evaluating the companies' respective product lines. Derailed in '85 Given Paradyne's history, said Andrew Schopick, an analyst at Gartner Group, Inc. subsidiary Soundview Financial Group, the AT&T offer is a coup for the Florida-based company. Once regarded as a promising communications contender, Paradyne was derailed by a 1985 indictment for fraud in connection with a purportedly rigged government systems bid. The organization's 1987 settlement with the U.S. Department of Justice dismissed all charges other than conspiracy, to which the company pleaded guilty; however, four years of harrowing legal combat depleted both its morale and financial resources _ legal bills alone were estimated at approximately $18 million, according to a company spokesman. Moreover, further legal woes and the need to triage outdated technology and unprofitable product lines mired Paradyne in further staggering expenses. 
Under the terms of the proposed acquisition, AT&T will hold Paradyne as a subsidiary; the company will remain at its Largo site, under the direction of current Chief Executive Officer John Mitcham, who was brought in from IBM last summer to help turn Paradyne around. By Nell Margolis, CW staff <<<>>> Title : network management is a v Author : CW Staff Source : CW Comm FileName: jtkforca Date : Jan 9, 1989 Text: network management is a veritable sea of choices bobbing with different network platforms, all festooned with ``open interface'' banners. According to an April survey by Forrester Research, Inc. in Cambridge, Mass., users are targeting their dollars at IBM's Netview, AT&T's Unified Network Management Architecture, OSI and Digital Equipment Corp.'s Enterprise Management Architecture (EMA). Even so, the market is so fragmented _ and clearly still up for grabs _ that the leading segment in the survey was the ``other'' category, which garnered the largest purchasing share, 37%. While they wait for vendors to flesh out their strategies and support for multiple environments, many users are taking a deep breath, crossing their fingers and plunging forward into homegrown solutions. One approach calls for selecting a few areas to monitor _ for example, session management, physical-layer management and packet-switch nodes. A mix of network management systems usually does the job, but more often than not, the systems are completely separate. ``From a network control perspective, we basically are going for the best equipment that we can find for the particular function, but it has to interoperate with what we've got,'' Harris says. His IBM Systems Network Architecture network is standardizing on IBM's Netview, with a Datapoint Corp. server functioning as the gateway to all non-SNA systems. ``We've made some pretty cavalier decisions in terms of moving forward,'' he points out. 
``I'm afraid that one of these days it will turn out that one of our primary [business partners] has a whole office filled with Wang or something, and I'll go `Oops!' When that happens, I'll have to get real smart, real fast.'' About 50% of the IDC survey respondents said they currently use from one to three network management systems. Roughly 30% use between four and six systems; another 2% use more than six; and about 10% use nothing at all. Interconnection futures Many users answer the challenges of interconnection or interoperability by resorting to reentering a printout from one system on another system's console. In lieu of concrete offerings from vendors, users are adding some elegance to the ``swivel-chair'' approach, says Marvin Chartoff, an analyst with Ernst & Whinney in Fairfax, Va. In other words, move all the consoles for the different systems together, allow them to dump data into the same printer and start gearing up for automated operations. A number of vendors are taking steps to provide for interconnection, but again, they're talking futures. For example, Hewlett-Packard Co. and AT&T are publishing the specifications to their management architectures. IBM released the specs for Netview/PC to third parties, and OSI cheerleader DEC indicated it may provide specs to its recently announced EMA sometime next year. All these companies have promised to migrate their systems to the OSI Network Management standard _ once it becomes finalized, which may be as much as three to four years away _ or to at least support the OSI communications interface. Yet even OSI provides no guarantee of interoperability. As users of CCITT X.400 gateways are finding to their increasing dismay, just because two products comply with the standard doesn't mean they can talk to each other. It also means users run the risk of getting locked into proprietary implementations of these standards. The solo approach The alternative approach is to go with a single-vendor solution. 
``I believe the one-vendor philosophy is a little easier to coordinate,'' Affiliated Bank Services' Robeck says. But this approach is not always the most realistic one. ``It's true that staying with one homogeneous approach is easier on development overhead and system administration, but given the real world, in most installations there is a real need for more interoperability,'' says Clare Fleig, director of research at International Technology Group. She sees the single-vendor approach as more suited for larger, terminal-based systems. Users in this camp risk potential lock-in and commitment of the future of their network to one vendor, Chartoff says. ``You can also eventually restrict the types of applications that you support,'' he adds. Actually, most network management systems combine vendor-supplied network hardware and a mix of third-party packages and internally developed applications and systems. For example, the New York-based Financial Industry Standards Organization (FISO), a consortium of some 20 financial services companies that are trying to develop a common communications standard for their industry, is planning to use a mix of OSI and proprietary networking solutions. But some users such as Cyanamid's Kascik view standards with considerable suspicion. ``I'm not waiting for OSI,'' he maintains. ``Standards would be nice, but vendors won't allow real good integration unless you are willing to write hooks into their systems.'' Regardless of which way a user turns, it is imperative that they start moving now, Frank says, warning, ``If you don't start today, you won't have a prayer. The average 3090 generates 1,000 messages per second. The average network control operator can read one per second and act on one every two seconds. Consider that CPU activity is supposed to go up 12% a year. 
So if you think things are out of hand now, in a couple of years it will be worse.'' Users can begin to lay the foundation for network management even if they are undecided about which direction to take, Gartner's Frank says. ``You have to realize that all network management decisions today are tactical _ the strategic stuff is five years out.'' He advocates organizing ``tedious, low-level, time-consuming tasks,'' such as the following: Combining databases, or at least making multiple databases appear as one entity. Starting now to build a pool of network management specialists and cross-training at every opportunity. Knowing what you have, where it is and what you need. Moreover, users should determine what they really need to manage their networks. ``Vendors can only provide a reasonable solution to the extent that they get reasonably stated problems,'' notes Thomas Nolle, president of Haddonfield, N.J.-based CIMI Corp. Regardless of the path that they take, users need to make certain that their chosen management strategy will help them control their networks rather than give vendors a way to control the account. Advises Nolle: ``Select the network management system that seems like a fit to your problems, even if it's not integrated. It's much better for you than one that is beautifully integrated but does not meet your needs.'' <<<>>> Title : The year of alliances mak Author : Douglas Barney Source : CW Comm FileName: forcast Date : Jan 9, 1989 Text: Vendors are learning to play by a new set of rules. No longer is it acceptable for them to lock users into a totally proprietary system _ it just won't sell. Instead, both hardware and software vendors are having to change the way they operate in order to meet user demands for the best overall solution and to allow incompatible architectures to work together. 
The biggest shift in vendor operations has been the rise of so-called strategic agreements that hopefully turn into strategic relationships _ which hopefully create strategic products. Some strategic alliances work. Others fall apart because of the squabbling and political infighting that is the very nature of competition. What follows is a look at some of the most important strategic alliances, most of which have been announced within the last year or two. All of these fall into categories: It Worked, It Failed or Yet to Be Determined (YTBD). YTBD: DEC and . . . DEC and Tandy. Earlier this year, Digital Equipment Corp. conceded defeat in selling its own PCs. It tried the Rainbow, which failed because it was incompatible with the standard IBM established. Then it tried with the Vaxmate, a network-oriented PC that failed because of its high price, lack of color and serious problems with overheating, all of which were well documented by the press. DEC is preparing to take a fresh crack at this market by selling IBM PC-compatible machines custom-built by Tandy Corp. Details of the products will not be announced until early next year, and only based on those details can users and analysts gauge DEC's chances for success. But if the products and the prices are right, and if DEC salespeople are effective and add value such as advanced networking, DEC may finally succeed in the PC market. But if any of these conditions are not met, DEC will fail again. Analysts say they believe that DEC may have finally hit on the right approach and that the company will work hard to make sure it succeeds. According to John McCarthy, director of professional systems research at Cambridge, Mass.-based Forrester Research, Inc., DEC will finally succeed simply because it will put in the necessary effort. DEC and Compaq. Prior to the DEC/Tandy announcement, the minicomputer giant copped a deal with Compaq Computer Corp. 
This deal is much less ambitious than the Tandy arrangement, essentially calling for cooperation to ensure that Compaq systems work effectively on DEC networks. If the companies work well together, Compaq machines will be highly compatible on DEC networks. Knowing Compaq, even if the two do not work well together, Compaq machines will be highly compatible on DEC networks. The deal is viewed as a bit of a no-brainer, with little chance of failure. DEC and Apple. DEC has been cranking out strategic alliance after strategic alliance, a real break with its proprietary tradition. One that caught a lot of attention was its alliance with Apple Computer, Inc. The alliance was a recognition that VAX shops were being increasingly filled with Apple Macintosh computers. Because few of these shops were buying Vaxmates, it was no loss for DEC to endorse, support and help sell Macs. In terms of lifting the visibility and respectability of the Macintosh, the agreement has been wildly successful, analysts say. But there is another, co-development aspect that has yet to be proven. Apple and DEC are supposed to be developing tool kits that will provide tighter integration between Macs and VAXs. Until these are fully fleshed out, available and proven, the DEC/Apple alliance is YTBD. Whether it succeeds or not, its importance has clearly been diminished, because DEC will soon be pushing Tandy machines, analysts argue. Others YTBD Sun and AT&T. This alliance has resulted in no product shipments, but the very announcement shook the foundations of the entire computer industry. Earlier this year, AT&T joined forces with workstation upstart Sun Microsystems, Inc. to jointly develop a version of Unix that will come complete with a graphical user interface called Open Look. The agreement was supposed to create, once and for all, a single Unix standard. Instead, it splintered the industry into hostile opposing camps. 
AT&T, however, has been mustering its forces and now boasts support from Unisys Corp., NCR Corp., Amdahl Corp., NEC Information Systems, Ing. C. Olivetti & Co. and Toshiba America, Inc. This group, now dubbed Unix International, is gaining steam and may well steamroll the opposition. OSF. The Open Software Foundation was created to blunt the effect of the AT&T/Sun alliance. The OSF draws together firms that compete against each other with venom and aims to develop an open, licensable version of Unix. The key players include DEC, IBM, Apollo Computer, Inc. and Hewlett-Packard Co. _ but, so far, no AT&T or Sun. There is still disagreement as to whether the OSF will ultimately succeed. ``One or two key defections and it's history,'' Forrester Research's McCarthy maintains. Other analysts, such as International Technology Group's Clare Fleig, say they believe the sheer weight of the players will ensure success. Microsoft and IBM: So far so good. Microsoft Corp. Chairman Bill Gates must thank his lucky stars that IBM chose MS-DOS as the operating system for its original IBM Personal Computer. Gates parlayed that deal into a billion dollars worth of personal wealth and a company that sits at the very top of the PC software heap. Gates has struggled to keep wily IBM on his side. He fought to get Armonk to support OS/2. It did. But the bigger challenge came on the interface side. Gates, a graphical user interface maven if ever there was one, battled tooth and nail to get IBM to implement Microsoft Windows as the user interface for OS/2. IBM agreed, with one twist. Windows would have to be rebuilt, and IBM mainframe-oriented graphics technology would have to be added. In addition, Windows was not only changed, but renamed Presentation Manager. Things are still looking relatively good. IBM has agreed to support a 32-bit version of OS/2, which is likely to keep the relationship strong for several more years. But some cracks are beginning to emerge. 
IBM is licensing a user interface from Next, Inc. that, by its very definition, will compete with Presentation Manager. ``Without Microsoft, IBM would just be a ship in a storm without an anchor,'' McCarthy says; he argues that the alliance will remain strong. Ashton-Tate, Microsoft and Sybase: YTBD. Critics were immediately skeptical. How could two software companies that were making clear moves into each other's markets possibly agree on anything? they asked. Well, they did agree to co-market some multiuser database software called SQL Server, and Ashton-Tate, Inc. did agree to design Dbase IV to act as a front end. But SQL Server ship dates have slipped, and Microsoft has reportedly been displeased with Ashton-Tate's progress on Dbase IV 1.1, the front end to SQL Server. The whole deal almost fell apart after a well-publicized spat between Bill Gates and Ashton-Tate Chairman Ed Esber. Gates' temper flared after he heard that Ashton-Tate planned to distribute SQL Server through Novell, Inc. Novell is Microsoft's key rival in the local-area network operating systems market, and letting such a rival resell a Microsoft product simply would not do. Gates forced Esber to call off a press conference and cancel the agreement with Novell. Still, the organizations insist that all is well and that the relationship between Microsoft and Ashton-Tate is strong. But other cracks are starting to appear. Ashton-Tate has licensed another multiuser engine from Interbase and has pledged to support OS/2 Extended Edition, which by definition competes with SQL Server. On the other hand, Microsoft is busy readying its own end user-oriented PC DBMS that will work with SQL Server. This, by definition, competes with Ashton-Tate's Dbase. Can these two partners/competitors hang together? Only time will tell. For now, their relationship is rated YTBD, and the product is rated Yet to Be Delivered. IBM/Stratus: YTBD. This agreement is one of a kind for IBM. 
The deal that calls for IBM to resell Stratus Computer, Inc. fault-tolerant computers _ renamed the IBM System/88 _ is a radical departure for the mainframe monolith, which develops most of its own systems. But that departure has gotten IBM into a lucrative and growing fault-tolerant market, and the deal now makes up some 25% of Stratus' revenue, providing much-needed cash for development and growth. Analysts say the deal is working out well and point to IBM's own scaled-back fault-tolerant development efforts as proof. So far so good. EISA: YTBD, but looking rough. At first blush, many analysts thought it looked great: Compaq and eight other PC clone vendors rallied together to develop an alternative to IBM's Micro Channel Architecture bus. These nine firms, which make up the Extended Industry Standard Architecture (EISA) group, pledged to work together on a 32-bit bus that would be available in late 1989. But savvy observers and users spotted the chink in the armor right away. Some questioned how competing vendors could work together, particularly when there was no defined hierarchy for decision-making. These same critics pointed to a 1986 effort at bus standardization spearheaded by Phoenix Technologies Ltd. that fell apart due to the squabbling of the competing firms involved. Bottom line for EISA is Yet to Be Built. Because so many EISA members are developing Micro Channel products, many analysts now say they believe that the EISA bus will never be built. Strategic alliances provide a host of benefits for vendors. For some, it is access to technology and new markets. For others, it satisfies the need for publicity. Users, too, come away with mixed results. Some alliances provide users with products that are impossible to create any other way. In some cases, the fruit of alliances is the bridging of incompatible systems. But far too often, alliances leave users with nothing but disappointment and thoughts of what might have been. 
By Douglas Barney; Barney is a Computerworld senior editor, microcomputing. <<<>>> Title : concerning denormalizatio Author : CW Staff Source : CW Comm FileName: jsftlin0 Date : Jan 9, 1989 Text: concerning denormalization must be made with care. Anyone who understands the data can create a normalized design; denormalizing data also requires an understanding of how it is used. Relational database applications can perform well, but with today's technology, this can happen only when the number of blocks of data that any transaction accesses is controlled. This control may occur naturally with the process the transaction must perform _ for example, display a specific record on the user's screen _ or may occur in the table design, as when all tables are sufficiently small. Otherwise, the control must be artificially built into the application program or the table design. Quite often in the mainframe environment, a normalized design does not perform well, even when the cache memory or buffer pool is increased or the speed of the processor is increased. The only choice then is to denormalize the data somewhere. As Lapid made clear, denormalization decisions usually involve trade-offs between flexibility and performance. Such decisions are not always made easily. In fact, denormalization is really an extension of the art of normalization. Normalization requires an understanding of the data, but intelligent denormalization also requires an understanding of the flexibility requirements of the data; awareness of the update frequency of the data; and knowledge of how the database management system, the operating system and the hardware might work together to deliver optimal performance. The simple truth is that denormalization is an economical and technological necessity; it must be seriously studied. A well-performing database cannot be designed strictly within the confines of relational theory. 
It can only be accomplished with an awareness of the total environment in which it must operate. <<<>>> Title : 1988 year for bargain sho Author : CW Staff Source : CW Comm FileName: market26 Date : Jan 9, 1989 Text: During 1988, used equipment prices experienced a dramatic decline for a number of computer products. For the owners of equipment _ both data processing users and equity investors _ this drop may require a write-down of their portfolios. However, for purchasers of used equipment, the marketplace presents particularly attractive values. IBM equipment ranging from 4381 processors to 3380 disk drives to 3800 laser printers fell victim to these declines. Retail prices for the 4381 Model P13, for example, fell from January prices of 58% to November prices of 32% of IBM list price. Bargain hunters will also find savings in the storage marketplace. In January, a head-of-string 3380 AA4 with one 3380 B4 was trading retail for about $31,100. By November, the retail price for the same equipment was only $10,700. This represents a decline of 66%. IBM 3800-3 laser printer prices also fell 32% in that time. Even alternative vendors' equipment was not spared from suffering through substantial declines in used prices. The Wang Laboratories, Inc. VA-65 processor dropped from 57% of list price in the beginning of the year to current prices of 30%, while the Digital Equipment Corp. VAX 8600 fell from 70% to 43% over the same time span. Some of these declines can be attributed to the normal life cycle of computer products and the impact of replacement technology, while other declines are more difficult to explain. The drops in the IBM direct-access storage device (DASD) products mentioned above can be accounted for by the shipment of higher capacity 3380 K drives, falling prices of older 3380 D and E drives and increased plug-compatible manufacturer competition. 
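The price movements quoted above reduce to simple percentage arithmetic. As a minimal sketch, assuming the dollar and percent-of-list figures cited in this article, the declines can be checked as follows:

```python
def pct_decline(old_price, new_price):
    """Percentage drop from an old price to a new price."""
    return (old_price - new_price) / old_price * 100

# IBM 3380 AA4 head-of-string plus one 3380 B4:
# about $31,100 retail in January, $10,700 by November.
dasd_drop = pct_decline(31_100, 10_700)
print(f"3380 string decline: {dasd_drop:.0f}%")  # about 66%, as cited

# The 4381 Model P13 went from 58% to 32% of IBM list price.
# Against a fixed list price, that implies this drop in used value:
p13_drop = pct_decline(58, 32)
print(f"4381 P13 implied decline: {p13_drop:.0f}%")
```

The second figure is derived, not stated in the article; it simply treats the percent-of-list quotes as prices against an unchanged list.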
On the other hand, the 4381's demise was caused more by IBM's upgrade pricing and a general lack of end-user demand. Predictions What can we expect in 1989? IDC Financial Services Corp. said that IBM will be making major product announcements in both DASD and medium- and small-scale 370 processor families. In the first half of 1989, IBM is expected to announce the replacement for the 3380, called the 3390; the replacement for the aging 4381 family of processors, dubbed the 4391; and additional models of the Enterprise System/9370. DEC, Wang and several other computer manufacturers are also expected to make significant announcements in 1989. As these new products displace the prior generation, supply of used equipment will increase. This increased supply will, in turn, create downward pressure on secondary market prices. What posture should MIS directors take under these circumstances? Purchasing new equipment from the manufacturer without regard to secondary market activities and potential future announcements can deplete corporate funds unnecessarily. As a result, MIS departments may be left without the capital to purchase or lease additional equipment when significant new announcements are made. Given the bargains on the secondary market and the projected announcements of replacement products, it is difficult to understand why anyone would purchase new equipment from the vendor at list price. Astute MIS directors should consider the purchase of used equipment or short-term leases to fill interim needs and conserve funds for purchase or long-term lease of future products. For MIS directors with limited budgets, the new year will present a significant opportunity to bring their data centers up to almost leading-edge technology at very reasonable expenditure levels. 
Those directors may want to consider IBM 3090 processor base models, Amdahl 5890 processors and National Advanced Systems XL processors for their high-end needs, all of which are projected to trade for less than 50% of list price by the end of 1989. Significant opportunities will also exist for DASD and cartridge tape and a whole range of other computer products. For more information, contact IDC Financial Services' Terri LeBlanc at 508-872-8200. By Thomas J. Donovan, IDC Financial Services Corp. <<<>>> Title : Klooge! Author : Glenn Rifkin Source : CW Comm FileName: forefun Date : Jan 9, 1989 Text: Ebenezer Klooge lay in his majestic bed, tossing and turning as the wind howled outside his window. It had not been a good day at the shop. The year was winding down; in fact, it was Christmas Eve, and Klooge's staff had made not a dent in the massive backlog that they had started with last January. Indeed, the addition of the new whiz-kid applications specialist, Bob Batchit, had not produced the expected benefits at all. If anything, the backlog was worse, though Klooge had pushed Batchit without mercy. The pathetic propeller-head might still be at his terminal for all Klooge knew. So what if it was Christmas Eve? Bah, humbug on all this sentimentality, Klooge thought. The CEO was even meaner than he was. ``COMPETITIVE ADVANTAGE, KLOOGE!!! NOW THAT'S WHAT THIS IS ALL ABOUT, MY GOOD MAN.'' A productivity increase is really what it's all about, Klooge sneered _ a productivity increase, nuts to eggnog. Oh, how he loathed spineless Batchit, who had crept up to Klooge's desk that very afternoon. ``Uh, pardon me, Mr. Klooge, sir. But may I have a word with you, sir?'' ``What do you want, Batchit?'' Klooge had barked. ``Well, sir, I was wondering, uh, I mean, I was hoping to . . . uh, leave a bit early today. You see, it is Christmas Eve, after all, and my family has a goose cooked, and my little son Micro Tim is ill and could hardly play Nintendo this . . 
.'' ``BAH, HUMBUG!'' Klooge had screamed. ``Back to your workstation or your goose will be cooked. You're already getting the whole day off tomorrow, which means bringing the system down tonight. Do you realize that we are three months behind schedule on applications requests? That my stat mux is on the fritz, and the CEO wants to know what we're doing about DB2? And they canceled NCC, for crying out loud, so I didn't even get to Vegas this year! Back to your desk, you sniveling bit-sniffer, or you can call Robert Half next week.'' And off slouched the stooped, sad-eyed Batchit. Klooge looked at his multifunction 386-chip wrist computer and frowned. ``Bah, it's 5 o'clock and all these geeks will want to leave just because it's Christmas Eve,'' he had snarled. ``Well, I'll get another good hour out of them, anyway.'' With that, he had turned toward the staff and bellowed, ``CODE, CODE, CODE, GIVE ME COOOOOODE!!'' Now, as he tossed in his bed, Klooge pulled his nightcap down snugly around his ears. ``Damn drafty in here,'' he muttered as his eyelids drooped. Moments later, the sash to his bedroom window flew open, and the curtains unfurled in a whoosh of cold night air. Flakes of snow swirled into the room as Klooge leapt upright in bed, his eyes wide with fright. ``What is it? Who's there?'' he screamed. Into Klooge's room jumped Charles Wang. ``Frank Dodge, I want your company!'' he yelled. ``WHAAAT?'' Klooge yelled back. ``Isn't this McCormack & Dodge?'' Wang asked as he floated above Klooge's bed. ``No!'' Klooge replied. ``Ooops, my mistake. Wrong morality play,'' Wang apologized as he leapt out the window into the night. Klooge huddled under the bedclothes and shivered. ``I have to cut out the pizza with onions before bed,'' he said to himself. Fitfully, Klooge drifted off to sleep. The electronic grandfather clock beeped three times in the pitch dark. Suddenly, the window flew open again; wind and snow swirled into the room. Klooge hardly opened an eye. 
``Where's my Maalox?'' he said, groping toward his nightstand. A huge, ghost-like figure floated in on the wind. The figure wore a white smock and had a plastic pen protector in his pocket. His hair was disheveled, and he wore thick, dark glasses. ``EBENEZER KLOOGE!'' the figure bellowed. Klooge sat bolt upright in bed. ``What is this _ Grand Central Station? Hey, don't I know you?'' he demanded of the apparition, with more than a tinge of fear in his voice. ``What a dummy. Sure you know me. I'm your old DP manager, Jack Nerdley.'' ``You look awful, Jack.'' ``I'm a ghost. I've been dead for seven years. If you were doomed to wander from data center to data center for eternity with spaghetti code and batch cards tied to your ankles, you wouldn't look so hot, either.'' ``Good point, but you didn't look so great when you were alive.'' ``Just my luck, I get to haunt a wise guy. Listen, buster, I got news for you. You're next, Mister Hotshot CIO. Took my desk and kept yelling `CENTRALIZE, CENTRALIZE!' Well, you're in for a long night here.'' ``What do you mean?'' Klooge asked, his eyes widening. ``You'll see. The big guy is sending three spirits down here to see if they can straighten you out. The word is that you are giving the information age a bad name. Keep it up and you'll be schlepping around with me for the next billion years.'' ``Right, and I'm sure Tom Watson Sr. will drop in for tea tomorrow night,'' Klooge said. ``OOWWWWWWW,'' Nerdley howled. ``Boy, are you gonna get it. They'll probably chain an old 1401 to your ankle and make you lug that around for eternity. Anyway, I'm outta here. Just thought I'd warn you. Good luck; you'll need it.'' With that, Nerdley backed slowly out of the room and through the wall. Klooge lay back under his blankets and shivered. ``This is giving me a headache,'' he moaned and then promptly fell asleep. Inexplicably, the beeper on his digital watch signaled 1 a.m., and Klooge opened a bleary eye. 
At the foot of his bed was a small figure feverishly counting a large stack of punch cards. It appeared to be sitting but was actually floating transparently above the bed. ``Let me guess,'' said Klooge. ``Ghost Number One.'' ``You hearda me?'' the ghost replied, surprised. ``I thought I'd surprise you, get a few screams and squeals. Just for laughs.'' ``I knew I should have spent Christmas in Florida,'' Klooge sighed. ``I AM THE GHOST OF SYSTEMS PAST!'' the apparition replied. ``I've come to take you back, back, back to your own past. Come, there's no time to waste. The time machine is double-parked.'' ``Get outta here,'' yelled Klooge. ``Johnson, right? You're the wise guy from telecom. Nice outfit, you clown. Now beat it.'' ``KLOOGE, COME WITH ME!!!'' the ghost commanded. And with that, Klooge rose involuntarily from the bed, his nightshirt fluttering in the wind. ``WHOAAAA,'' he wailed as he whooshed out the window close on the heels of the ghost. Huge sand-filled egg timers and gigantic silicon chips whirled by as the wind howled and Klooge twirled along. ``Nice effect, huh?'' the ghost asked. Klooge was wide-eyed with fright. ``Where are we going?'' he begged. ``We're going back to find out how you got to be such a great guy,'' the ghost said. Before he could respond, Klooge and the ghost appeared in a glass-enclosed room, the air-conditioning turned up high. A group of programmers sat in rows behind a rotund, short, jolly-faced man in a white smock. The room was dominated by an IBM 7090 and a cluster of huge card-punch batch machines. ``Why, that's old Wizziwig, my first boss,'' Klooge cried out. ``And that's me as an entry-level programmer. Wow, did I look like a nerd or what?'' ``They can't see or hear us,'' the ghost explained. Old Wizziwig stood and announced loudly to the room, ``Come, my young technoids, it's Christmas Eve and you're already cross-eyed from too much programming. 
Let's have some eggnog and celebrate.'' Young Klooge rushed over to Wizziwig breathlessly. ``But Mr. Wizziwig, I've almost finished this piece of code. Another three hours and I'll have it.'' ``Lighten up, Klooge,'' Wizziwig laughed. ``It's Christmas Eve, and even the accounting department is going home. It's time to enjoy the season.'' Young Klooge was visibly disappointed but straggled along to the party. There, a pretty young woman _ the data processing department secretary _ came up to him. ``Oh, Ebenezer _ God that's a weird name _ but hey, do you want to dance?'' ``Sorry, Darlene, I'm totally consumed by this I/O problem, and dancing would just cause unwanted downtime,'' young Klooge replied. ``Jeeesh, what a geek,'' Darlene sighed and turned away. Old Klooge turned to the ghost. ``Please, I can't stand it. Take me away from here.'' ``Can't stand seeing yourself blow a chance at happiness, huh?'' the ghost asked as he whisked Klooge into the night. ``Are you kidding? No. I can't stand to see a programmer wasting time at a party.'' With that, the pair landed in a boardroom in a big city several years later. Three young businessmen were meeting. Klooge recognized them as former co-workers from Wizziwig's. ``Hey, it's Edson, Neidenfer and Baker,'' Klooge marveled. ``Those guys started that billion-dollar software company. What are we doing here?'' ``Just listen,'' the ghost said. The three men were looking at a business plan. ``So do we ask Klooge to join us?'' Edson asked. ``I don't know,'' Baker said. ``He's such a jerk.'' ``Yeah, forget about him,'' added Neidenfer. ``He's a good programmer, but no one can stand him. He's so obnoxious.'' ``OK, he's out,'' Edson agreed. ``Now let's talk about venture capital.'' Klooge stood transfixed in horror. ``Ooohhh, take me outta here. I coulda had founder's stock, I coulda been rich. Oh, that hurts. I get the picture.'' Suddenly, Klooge found himself swirling forward through time, and then he landed in his bed. 
Before he could think about what had just happened, he fell into a deep sleep. The sound of a loud dot matrix printer awakened him with a start. Another ghost was sitting at a terminal as stacks of green bar paper poured out of a nearby printer. ``Whatsit . . .'' Klooge mumbled sleepily. ``It's just me, the Ghost of Systems Present,'' the apparition replied. ``Aw gee, and I was hoping it was John Sculley and Steve Jobs,'' an irritable Klooge retorted. ``Just my luck, a wise guy _ and on Christmas Eve, yet,'' the ghost said, looking to the ceiling. ``Why don't I get the good stuff, like It's a Wonderful Life, or Miracle on 34th Street? Oh well. Come with me, turkey.'' With that, the ghost reached out and touched Klooge's collar and they were whisked off into the night. They landed instantly outside a cottage in a run-down neighborhood. The ghost took Klooge through a wall of the cottage into a tiny, bedraggled living room. Inside, a fire glowed in the fireplace while a smiling woman sat in a rocking chair stringing popcorn for a scraggly Christmas tree. A threadbare rug covered the living room floor as three children lay on their stomachs watching reruns of The Brady Bunch. ``When will Dad and Micro Tim get home, Mom?'' one child asked. ``Soon, I hope,'' the woman replied. ``It's getting cold out there, and Micro Tim is so frail.'' ``My god,'' Klooge whispered. ``This is Bob Batchit's house. What are we doing here?'' ``You needn't whisper; they can't hear you,'' the ghost said. The door to the house suddenly opened, and in walked Batchit with his young son perched on his shoulder. Batchit was crestfallen. ``Oh, Bob, is it bad news?'' his wife asked. ``The memory upgrade cost too much,'' he responded. The family gathered around the father and child. Tears rolled down their faces. ``Poor, poor Tim,'' they cried. ``He's only got 128K on his PC. He'll never be a hacker now.'' ``It's OK, Dad. God bless us, one and all,'' said Micro Tim. Klooge watched in horror. 
``Take me away from here, ghost,'' he implored. ``If only I could help them out. I could have offered some of the free cycles on the mainframe for the poor kid.'' With that, Klooge was whisked back to his bedroom. Shivering and irritated, he slipped under the covers, but he couldn't sleep. ``I wonder if Max Hopper has nights like this,'' he thought to himself. Just as he was dozing off, Klooge heard a soft but pained moaning sound. He opened one eye and saw the third ghost reading the Daily Racing Form. ``Ooooh,'' the ghost moaned. ``I can see the future, but I forgot to put $10 on Blue Moon in the sixth at Hialeah. How stupid can you get?'' ``Let me guess _ the Ghost of Systems Future?'' Klooge suggested. ``You got it, ace. Let's put on those traveling shoes.'' And with that, Klooge was whisked off into the bleak night. ``Want to know the next big buzzword?'' the ghost asked as they went. ``Not particularly,'' Klooge replied. ``OK, fine, how about DEC's revenues for 1998?'' the ghost countered. ``How about telling me if my capital expenditures are going up for the next five years so I can write a budget?'' Klooge asked. ``No can do. The SEC keeps a close eye on stuff like that, you know.'' Suddenly, the ghost and Klooge were plunked down in Bob Batchit's living room again. The family sat sadly around the dining-room table. No one spoke. Off in the corner, a dusty Tandy PC sat next to a tiny crutch. Klooge stared in dismay. ``Oh no. Not Micro Tim . . .'' He looked at the ghost, who just looked away. ``Help me, ghost. Is this what shall be or what might be?'' Klooge begged. With that, Klooge was whisked off again. In a blink, they landed outside his office. ``What are we doing here?'' he asked. The ghost just nodded to three people in his office. One was going through Klooge's desk drawers, another was seated at his terminal, and a third was on his phone. ``Yeah, move Johnson's stuff in here this afternoon. 
We'll get rid of his junk this morning,'' the man said into the phone. ``So I hear the new guy is getting twice the salary,'' the first said. ``Well, he'll easily deserve it if all he does is act civil to the staff. No one is crying over Attila the Hun.'' Klooge looked wildly over to the ghost. ``Hey, spirit, this is a joke, right? Come on, good one, you got me, ha ha ha. What a kidder.'' The ghost simply shook his head and whisked the pair off to a downtrodden cemetery. A cold wind blew across the bleak sky and a single bare tree stood pathetically above the mostly abandoned graves. A pair of mobile robots with shovels were just filling in a new grave. ``Wow, robot gravediggers, great idea,'' Klooge said to the ghost. ``Just listen,'' the ghost replied. As the robots tamped down the final shovelfuls of dirt, one started to beep and buzz. ``Nobody showed up for this stiff,'' it announced. ``Manager said this one was a real loser,'' the other answered in a metallic, digitized voice. ``Let's get out of here,'' said the first. ``This place depresses my architecture.'' With that, it stuck a small headstone in the dirt and they wheeled off. The ghost pointed to the headstone, and Klooge, shivering and scared, got on his knees to get a closer look. ``Oh, no,'' he wailed as he cupped his face in his hands. ``What a drag.'' On the headstone was one word: KLOOGE ``Wow, I was hoping to retire to Miami Beach,'' Klooge said to the ghost. With that, the ghost lifted his arms and Klooge zipped away back to his bed. ``Zowee,'' he yelled. ``I'm back.'' He opened his window and leaned out. Down below was a young lad with his collar turned up against the cold. ``You, boy!'' Klooge called. ``What day is this?'' ``Where are you from _ Mongolia? It's Christmas!'' the boy yelled back. ``What a nice boy, bright, spirited,'' Klooge said. ``Listen, wise guy, I know it's Christmas. You don't know what kind of night I just had. Anyway, you know the Businessland store down on Grange? 
I want you to run down there and buy a PS/2 Model 50 with a laser printer and have it delivered to the Batchit house at this address. Buy a Nintendo game for yourself.'' With that, Klooge dropped his Mastercard down to the boy. ``How do you know I won't take off and charge a new BMW with this?'' the boy called. ``Oh my good lad, it's Christmas! The world is beautiful, I'm happy to be alive. And besides, I'd find you and break your little arm. Now be off with you.'' Klooge ran to his closet to dress. He hummed loudly to himself. ``Oh, it feels so good to not be a jerk. I can't wait to get to Batchit's house and see the look on Micro Tim's face when he sees the new computer.'' Klooge walked joyfully through the streets, wishing everyone a Merry Christmas. When he got to Batchit's cottage, he saw the delivery van pulling away. The family leapt for joy when Klooge came in their house. Bob Batchit came up to him. ``Mr. Klooge, this is so generous, so unexpected. What . . .'' ``Don't say anything, Batchit. Just enjoy it. I want that boy of yours to become a world-class hacker,'' Klooge said. ``I've turned over a new leaf,'' he bubbled on. ``I'm going to order workstations for everyone in the company. I'm going to give up control of the network. I'm going to offer more applications to the end users. I'm going to DE-centralize.'' ``Wow, and are you going to admit that your five-year strategic IS plan was a flop?'' ``Don't get carried away, Batchit. This `good will' stuff only goes so far. '' By Glenn Rifkin; Rifkin is a Computerworld senior editor. <<<>>> Title : manufacturing process. Author : CW Staff Source : CW Comm FileName: jcimcast Date : Jan 9, 1989 Text: manufacturing process. Vendors are just starting to come up with CIM solutions on this higher, more complex level. First, they are aggressively entering alliances with other hardware vendors, particularly controller vendors, according to AMR Vice-President Bruce Richardson. 
``The goal is not cell control but cell integration,'' he says. ``If we're building vitamins or cars, we don't want to walk down to the factory floor with the ordering slip.'' Mainframe scheduling systems need to be linked to area controllers that coordinate production on the factory floor ``so no order is done until the day it's due,'' Richardson says. Last fall, DEC and Allen-Bradley Co. announced their Pyramid Integrator. The device comprises an Allen-Bradley cell controller, which provides links to shop floor devices, and a DEC Microvax processor module, which provides integration with plant and area management applications that typically run on DEC systems. Another recent alliance for integration was formed between Motorola Computer X, Inc. _ a Motorola, Inc. subsidiary _ and Stratus Computer, Inc. Computer X offers a hardware platform for coordinating the different cells on a factory floor, each of which can be working on a different part of a product or process. Keeping track Stratus sells its fault-tolerant hosts as area controllers that can coordinate activities across the plant; it also provides links to design and engineering workstations and IBM hosts that often run administrative and scheduling applications such as manufacturing resource planning. Computer vendors are also reaching out to the CAD/computer-aided manufacturing side by supporting manufacturing's dominant communications protocols. Both IBM and DEC, for example, have announced support for Sun Microsystems, Inc.'s Network File System and Apollo Computer, Inc.'s Network Computing System. Apollo and Sun have reciprocated by supporting Decnet and Systems Network Architecture. As a result, Sun engineering workstations can send design documents directly to IBM scheduling and bill of materials systems. 
But customers want more than just a physical link in order for graphics documents to be sent to the shop floor in a useful form, according to Donald Bell-Irving, a manager of DEC's CIM applications marketing group. ``You need low-cost graphics devices on the shop floor and a fast LAN; but even more, you need a compound document technology'' that allows users to append routing slips, comments and changes to a design as it moves from one manufacturing area to the next, he explains. ``That technology is just becoming feasible.'' The big computer companies' other major marketing strategy is the CIM software platform _ a common user interface, communications and database environment that runs on their favorite hardware platforms (see story at right). They are also working with software companies to produce applications and development tools for their platforms. The software platforms address what Anthony Klemmer, a vice-president at ITP Boston Inc., a Cambridge, Mass., consulting company, refers to as a ``technology backlash'' against customized CIM applications that were too costly to be practical. ``If you can integrate and automate in pieces a platform that allows you to reuse software, standardize additions and thus lower costs, it is well worth it,'' he says. Several users, however, say that this latest wave of ``open'' software interfaces is too recent for them to judge its usefulness to their CIM plans. Some worry that if they choose one vendor's software platform, they will not be able to install those devices that do not support it. What to do The solution to this problem is for the host vendors to migrate their software platforms to industry standards such as MAP, according to James Caie, director of manufacturing systems at GM. Caie was in charge of CIM at the famous Saginaw plant. This migration would be a benefit to CIM implementors, since the platforms would provide something that MAP currently lacks: a standardized foundation for application development. 
Rather than wait for vendors to come out with more complete CIM standards and better development tools, many manufacturers are going ahead with CIM implementations _ with development plans carefully geared to take advantage of what is out there now, both on vendors' shelves and within their own installations. Savvy IS managers are also tying their CIM plans to specific business goals, rather than taking a wholesale approach to the technology. For example, rocketing interest rates and a depressed economy in the early 1980s drove Deere to find ways to cut inventory and production costs via CIM, according to David Scott, manager of marketing for Deere Tech Services. ``When interest rates are 22.5%, inventory starts eating up a lot of cash.'' The manufacturer implemented a top-down plan with a bottom-up implementation that ``made use of existing equipment and didn't go overboard putting technology in for technology's sake,'' Scott emphasized. The payback: Deere just announced record earnings for 1988, and its factories can continue to turn a profit even running at 30% to 35% capacity. Deere was among the pioneers, but a number of other companies are turning to CIM. Barring catastrophes in the manufacturing sector and the U.S. economy in general, we should see a snowball effect in the CIM world in the next few years. As the new wave of CIM implementors report the paybacks they have gleaned, yet more companies will be encouraged to start their own installations. This, in turn, will cause vendors to accelerate their efforts to provide better products and tools to gain greater shares of the burgeoning CIM market. While the term ``computer-integrated manufacturing'' was coined only a few years ago, the networking, software and computing technologies that serve as CIM's foundation have been developing for decades. This should reassure IS managers who do not like to bet their budgets on untested products. 
While the leading edge of CIM technology may be vaporware, there is now a solid body of available products to get implementors going. It's time to take the plunge. <<<>>> Title : Normalizing not only way Author : George Coleman Source : CW Comm FileName: sftlin0 Date : Jan 9, 1989 Text: In recent months, my name has been mentioned twice in this column regarding a study on data normalization that I conducted [CW, Oct. 17 and Nov. 28]. This leads me to address several points. First, I certainly did not work alone on the normalization study. It was a joint effort with David Young of Amdahl in Sunnyvale, Calif., and others assisted as well. Furthermore, the normalization experiment was only one of several performance aspects of IBM's DB2 that we measured. As to the issue of whether denormalization is good or bad, the truth is that almost all experienced mainframe database application developers recognize that normalization is not the last step in the process of database design. If a database application does not perform at acceptable levels of response time, and if attempts at tuning it or ``throwing hardware at it'' fail to solve the problem, the application must be redesigned. This usually means _ and this is especially true in on-line transaction applications _ that some data has to be denormalized. In his column, Remon Lapid cited some excellent reasons that denormalization can cause problems; all the reasons concern changing data or the structure of data. But that does not mean that all physical data designs should be normalized. It does mean that decisions concerning denormalization must be made with care. Anyone who understands the data can create a normalized design; denormalizing data also requires an understanding of how it is used. Relational database applications can perform well, but with today's technology, this can happen only when the number of blocks of data that any transaction accesses is controlled. 
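As a concrete, entirely hypothetical sketch of that trade-off, consider an order lookup. The table and column names below are invented, and SQLite merely stands in for a mainframe DBMS such as DB2: copying the customer name into the order row lets the common read touch one table instead of two, at the cost of storing the name in two places.

```python
# Hypothetical illustration of denormalization; names are invented,
# and SQLite stands in for DB2 purely for demonstration.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Normalized design: customer attributes live only in CUSTOMER.
cur.execute("CREATE TABLE customer (cust_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, "
            "cust_id INTEGER, amount REAL)")
cur.execute("INSERT INTO customer VALUES (1, 'Acme')")
cur.execute("INSERT INTO orders VALUES (100, 1, 250.0)")

# Reading an order with its customer name requires a join: two tables,
# and on a large database, extra blocks of I/O per transaction.
row = cur.execute(
    "SELECT o.order_id, c.name, o.amount "
    "FROM orders o JOIN customer c ON o.cust_id = c.cust_id "
    "WHERE o.order_id = 100").fetchone()

# Denormalized design: the customer name is duplicated in the order row,
# trading update flexibility (the name now exists in two places) for a
# single-table read path that touches fewer blocks.
cur.execute("CREATE TABLE orders_denorm (order_id INTEGER PRIMARY KEY, "
            "cust_id INTEGER, cust_name TEXT, amount REAL)")
cur.execute("INSERT INTO orders_denorm VALUES (100, 1, 'Acme', 250.0)")
row2 = cur.execute(
    "SELECT order_id, cust_name, amount "
    "FROM orders_denorm WHERE order_id = 100").fetchone()

assert row == row2  # same answer, different access path
```

The answer is identical either way; what changes is how many tables, and therefore how many blocks, the transaction must touch, which is exactly why the denormalization decision belongs to physical rather than logical design.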
This control may occur naturally with the process the transaction must perform _ for example, display a specific record on the user's screen _ or may occur in the table design, as when all tables are sufficiently small. Otherwise, the control must be artificially built into the application program or the table design. Quite often in the mainframe environment, a normalized design does not perform well, even when the cache memory or buffer pool is increased or the speed of the processor is increased. The only choice then is to denormalize the data somewhere. As Lapid made clear, denormalization decisions usually involve trade-offs between flexibility and performance. Such decisions are not always made easily. In fact, denormalization is really an extension of the art of normalization. Normalization requires an understanding of the data, but intelligent denormalization also requires an understanding of the flexibility requirements of the data; awareness of the update frequency of the data; and knowledge of how the database management system, the operating system and the hardware might work together to deliver optimal performance. The simple truth is that denormalization is an economical and technological necessity; it must be seriously studied. A well-performing database cannot be designed strictly within the confines of relational theory. It can only be accomplished with an awareness of the total environment in which it must operate. By George Coleman; Coleman is a staff systems engineer for Amdahl Corp. in Chicago. <<<>>> Title : Broadway says that if the Author : CW Staff Source : CW Comm FileName: jjbillfo Date : Jan 9, 1989 Text: Broadway says that if the PS/2s cannot be integrated into the mainframe environment, they become a ``dead issue,'' or virtually useless for her integration purposes. 
``I guess I'll have to find somewhere in the company to put these PS/2s.'' ``The PC market will experience specialization, a segmentation by brand into different product applications,'' IDC's Stephen predicts. ``With stand-alone desktops running on Intel 80386 processors, the market cannot really equate these products as standard personal computers.'' Most analysts say they agree that these machines, currently based on the 80386 processor and soon to include the 80486, are a bit much to put on one individual's desktop. Another factor that enters into the overall picture is the desktop cost structure. IBM Personal System/2 Model 70 and 80 machines can range anywhere between $6,000 and $10,000. Stephen estimates that machines built around the Extended Industry Standard Architecture, an alternative bus built by IBM's Personal Computer rivals to compete against the Micro Channel Architecture (MCA), will start in price at the $12,000 to $15,000 range. But currently, the majority of the million-plus MCA-based PS/2s at user sites are in stand-alone configurations. This means that most of these users are operating with, on average, $8,000 of hardware alone. Considering that few software or hardware products are available to take advantage of the PS/2 architecture, it is a safe assumption that these machines are currently not being utilized to their full potential. Common sense prevails But even in the midst of some very confusing predicaments, there is a light in the darkness: common sense. As companies try to be more competitive by using technology, MIS must consider that employees are using the PCs as tools, not to do their jobs for them. Common sense calls for MIS to deliver PCs that are appropriate for the users' tasks and to ensure that the machines can be integrated into the larger system. If the PC cannot be used to assist the employees' productivity, the purchase works against _ not for _ its business purpose. 
<<<>>> Title : Keeping an eye on all the Author : CW Staff Source : CW Comm FileName: highup Date : Jan 9, 1989 Text: From a distance, the images resemble the impressionistic brushwork of a Degas or Cezanne. Splashes of reds tangle with blues. Greens and yellows pepper a brown field. But move a little closer and the amorphous shapes slowly make an odd sense. Harbors, cornfields, houses _ even the white line running down the center of an airport runway _ appear like figures through a haze. The bizarre and beautiful images are the result of satellite imaging using a remote-sensing application built around a Digital Equipment Corp. Vaxcluster system at the Spot Image Corp. At the nerve center of Spot's headquarters in Toulouse, France, are three VAX processors _ an 8530, VAX-11/750 and 11/785 _ and two DEC HSC50 storage controllers. Every surface of the earth receives solar radiation that is either reflected or absorbed and emitted in specific wavelengths. These spectral signals are recorded by satellite-mounted high-resolution sensors sent up by Spot Image, with the DEC iron shaping the gathered information into photograph-like images. Spot's remote sensing techniques have already hit it big in several areas. Geologists can interpret the subtle variations in color to discover depression patterns and aid exploration, city planners can use it to make maps that are more precise, and the military can get a high-flying view of the world's hot spots. The process begins when data is garnered in ``image swaths'' that represent a section of the earth 60 kilometers wide and as much as 12,000 kilometers long, said Clark Nelson, director of corporate communications at Spot Image's U.S. subsidiary in Reston, Va. Data is then transmitted from the satellite to one of several Decnet-linked receiving stations positioned around the world. 
After the datastream is recorded on high-density digital tapes (HDDT) at 50M bit/sec., the tapes are transported to another Spot Image facility for photoprocessing. When the firm receives a customer order, it reviews its catalog of HDDTs. If the image is not available, the satellite must be positioned to image the appropriate area. If the imagery is already available on tape, the production process begins. The appropriate portion of the HDDT is located and transferred at a rate of 400K bit/sec. to a VAX processor, where its contents are decoded. The raw information is then fine-tuned and enhanced to remove distortions caused by the earth's curvature. Once corrected, the processor will store the output on either 1,600 or 6,250 bit/in. computer tapes; it is then directed to Vizir, a laser film writer from the French firm Sepimage, to produce photographic prints or transparencies. The Vizir is also controlled by the Vaxcluster system, which instructs the film machine on how to translate digital information and render the image. The process takes about 10 minutes. While the federal government accounted for nearly 70% of Spot Image's business in 1987, Nelson said that figure has been reversed so that seven of 10 sales came from the private sector in 1988. ``It has ceased to be a highly specialized requirement,'' Nelson said. By James Daly, CW staff <<<>>> Title : Britton Lee seeking new i Author : CW Staff Source : CW Comm FileName: softies1 Date : Jan 9, 1989 Text: The name Britton Lee will take a low profile as part of a new marketing strategy being undertaken by the Los Gatos, Calif.-based vendor of database hardware and software. Britton Lee, Inc. recently renamed its product line Sharebase and said the company will henceforth go by the trade name Sharebase From Britton Lee, Inc. The new business identity program is intended to stress the company's production-oriented SQL relational database management software as its principal product. 
The firm is renaming itself and its products to emphasize that it supplies both hardware and software, said President and CEO John Cavalier. The International DB2 Users Group was founded recently in Chicago. Intended for DB2 users regardless of mainframe platform, the organization was created to provide a forum for exchanging ideas and experiences and increasing productivity for DB2 users internationally, the group said in a release. The users group's 1989 conference chairman is Ken Paris of Peat, Marwick, Main & Co. in Montvale, N.J. The group's first international conference will be July 9-12 at the Hyatt Regency in Chicago. As the Unix dice roll, Oracle Corp. is hedging its bets. The Belmont, Calif.-based vendor of DBMS software and services said it has joined both the Open Software Foundation (OSF) and Unix International, Inc. Unix International, once called the Archer Group, is led by AT&T and seeks to establish its Unix System V as a standard. OSF is building its environment around AIX, IBM's version of Unix. <<<>>> Title : AI system to monitor Sout Author : CW Staff Source : CW Comm FileName: aaii Date : Jan 9, 1989 Text: Once the property of the academic and scientist, artificial intelligence was for many years a technology in search of a use. That is changing: Soon, everyone who turns on an electric light in Los Angeles will depend on an expert system for their power. Southern California Edison in Rosemead, Calif., one of the top five utilities in the U.S., is building an expert system to predict summer electrical consumption in the greater Los Angeles area. The load forecaster is intended to anticipate maximum electrical consumption 24 hours in advance of use. ``That is important because you need the resources for the next day's load. Otherwise, you'll have a blackout or brownout,'' said Mark Samaha, an applications engineer at the utility who is working on the system. 
Samaha has been hard at work recording the wisdom of a human expert, around which he is building the system. If all goes well, the computerized expert will be lending its advice to the utility by this summer. Copycats Knowledge-based systems mimic human problem solving; rules are applied to data to reach a conclusion much the same way an expert would reason through a problem. An expert system typically consists of the knowledge base of stored expertise, an inference or reasoning engine that executes the rules and a human interface component through which the user accesses the system. Samaha is using AI Corp.'s Knowledge Base Management Systems (KBMS) tools to create his application. KBMS is written in C and is intended for IBM mainframes. Southern California Edison will run its application on an IBM 3084 mainframe. Samaha interviewed his expert, a senior system operations engineer, for a total of 16 hours and derived a total of 650 rules that govern the predicting of power needs. ``The [human expert's] knowledge is at a very high premium. There are fewer than four or five individuals that could do his job. The job has demands unique to Southern California,'' said Samaha, explaining that an expert imported from New York would be sensitive to different variables and could not function as well. Watching the heat The summer weather is a major determinant of the region's power requirements, and rules are typically applied to weather information. In a rudimentary example, the expert looks at a weather report for the next 24 hours. If the prediction is for 95 degrees and high humidity, compared with 90 degrees with pleasant humidity levels today, then the area's power plants must produce a certain amount of additional electricity. During the coming summer, both the human expert and the computer expert will be used in parallel and their predictions compared against actual needs. The results will be used to fine-tune the system, Samaha said. 
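A rudimentary rule of the kind described above might be sketched as follows. This is purely illustrative Python: the thresholds and load factors are invented, and the actual system encodes some 650 rules in AI Corp.'s KBMS on an IBM mainframe, not in Python.

```python
# Hypothetical sketch of one weather-driven forecasting rule.
# All numeric factors are invented for illustration; the real
# system holds ~650 expert-derived rules in KBMS.

def forecast_extra_load(today_temp, today_humidity,
                        tomorrow_temp, tomorrow_humidity):
    """Return extra megawatts to schedule for tomorrow, rule by rule."""
    extra_mw = 0
    # Rule: a hotter forecast means more air-conditioning load.
    if tomorrow_temp > today_temp:
        extra_mw += (tomorrow_temp - today_temp) * 40  # invented factor
    # Rule: newly high humidity compounds the effect.
    if tomorrow_humidity == "high" and today_humidity != "high":
        extra_mw += 300  # invented fixed increment
    return extra_mw

# The article's example: 95 degrees and high humidity tomorrow,
# versus 90 degrees and pleasant humidity today.
print(forecast_extra_load(90, "pleasant", 95, "high"))  # prints 500
```

An inference engine fires such rules against the day's weather data in the same way the human expert reasons through it, which is what makes the parallel summer-long comparison between the two forecasters meaningful.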
The $50,000 system should pay for itself after its first summer of use, he said. By Stanley Gibson and Amy Cortese, CW staff <<<>>> Title : Waling on a Blue whale Author : Douglas Barney Source : CW Comm FileName: dcol Date : Jan 9, 1989 Text: Big Blue whale feast: IBM's Personal Computer strategy may be starting to work too well. Sick of getting trounced in the commodity-style PC price wars, the company fashioned a great plan. It got itself a new bus architecture and lost a whole lot of market share. Like a pool hustler letting the bumpkins win for a while, IBM made its competitors over-confident. They thought the industry giant was done for as a major PC player. Their mistake was thinking that the rules and ruler of the PC game had changed. Meanwhile IBM, as the only Micro Channel supplier, was feverishly locking competitors out of its key accounts. Then competitors made the bigger mistake of taking on IBM in public and then mimicking the IBM architecture with the Extended Industry Standard Architecture (EISA) bus. These folks didn't realize that you don't beat a market dominator with a slap in the face. Instead, you fall in line and do exactly what it does _ except do it better. Departing from IBM's PC bus direction was a major blunder, but it may not be too late to return to the old ways of competing with IBM. It's simple. Let IBM set the standard (the Micro Channel) and then go see someone, such as Phoenix Technologies, that will help you legally duplicate it. Next, kick the hell out of IBM with lower prices, more features and a friendlier approach. Meanwhile, keep the older stuff on the market and sell it cheap. Do it right, and users will love you while IBM learns to hate you. Cloners should remember that IBM is still a big Blue whale. It's strong all right, but it doesn't turn so well. With the right strategy, a few PC sharks can easily slip underneath that Blue wad of blubber and do what cloners used to do _ snack on IBM's entrails. 
But cloners need sharper teeth, like cheaper Micro Channel machines, as well as an aggressive stance toward the future. Then we can all sit down and enjoy a nice meal at IBM's expense. Tandy did it. Critics have speculated that IBM's Micro Channel Architecture was impossible to clone fully, arguing that so-called bus-mastering products would make the clones crumble. That has led to speculation that EISA was formed because cloners realized the Micro Channel was impossible to duplicate. That is apparently not the case with the Tandy 5000MC. Phoenix Technologies, which helps compatibles be compatible, has been testing the 5000 with bus-mastering products and reports that all's well that runs well. If Phoenix says the Micro Channel is clonable, then it is. The straight story on Magellan. Lotus has never done well selling software that lists for less than $495, but it hasn't given up trying. Early next year, the company will announce and ship (now there's a novel idea) Magellan, a $149 package that provides users with a more intuitive way of searching through files for information. With Magellan, users can enter a number of words that may be found in the file. After locating the right file, users can bring it up, along with the application, with one keystroke. Magellan also includes DOS shell capabilities, a macro language and tree-like directories. By Douglas Barney; Barney is a Computerworld senior editor, microcomputing. <<<>>> Title : Intex's Panorama lets you Author : CW Staff Source : CW Comm FileName: panorama Date : Jan 9, 1989 Text: Spreadsheet users who want to see the cause and effect of their transactions can now use a product that gives them the whole picture. Offered by Intex Solutions, Inc. in Needham, Mass., Panorama enables the viewer to pan a screen to observe all data movement and transactions as they occur. The package also enlarges the viewing capacity on the screen to include zoom windows. 
The windows display the accounts that will be affected by the given spreadsheet transaction. Users can simultaneously see where their data block has gone and which account has been affected by any particular spreadsheet manipulation. For example, if a 15% salary increase is given to an employee, the user can see how the increase directly influences cash flow, the balance sheet and anticipated annual income budgets in the zoom window while also viewing the 1-2-3 spreadsheet. Panorama also includes a mouse option. The package includes its own driver and runs with IBM's Color Graphics Adapter, Enhanced Graphics Adapter, Video Graphics Array and Hercules Computer Technology, Inc. and Compaq Computer Corp. screens. It is compatible with Lotus Development Corp.'s 1-2-3 Version 2.0 and higher and all versions of Lotus' Symphony. The product is currently priced at $95 plus shipping but is scheduled to rise to $145 beginning in May. By William Brandel, CW staff <<<>>> Title : AT&T grooms PBXs for data Author : CW Staff Source : CW Comm FileName: attpbx Date : Jan 9, 1989 Text: SANTA BARBARA, Calif. _ While IBM has apparently cast off its Rolm Systems division PBXs from its ISDN and data networking strategies, AT&T recently laid out a strategy for providing data networking via its PBXs and ISDN. The advent of Integrated Services Digital Network should boost the private branch exchange's credibility ``as local arbiter of high-speed communication between PC and data center,'' said Thomas Nolle, president of Haddonfield, N.J., research firm CIMI Corp. PBX vendors have been unable to sell this idea in the past because PBXs were less effective than local-area networks for local communications and could not provide an effective long-distance data link, according to Nolle. Now however, PBXs provide 64K bit/sec. ISDN links to ``someone at a remote site who is not on your LAN,'' he said. 
At a meeting here with the AT&T Digital Multiplexed Interface/ISDN Users Group, an organization of computer and networking vendors, the vendor announced plans to publish an interface for writing ISDN data applications that would link IBM Personal Computers via its System 75 and 85 PBX systems. Based on AT&T's Digital Communications Protocol (DCP), the interface is a focal point of the vendor's strategy to position its PBXs as links between PCs at a local site and ISDN wide-area networks, according to Roger Boyce, AT&T product manager of PC-PBX connections. IBM PCs equipped with the DCP interface card can use the PBX to gain access to AT&T's Primary Rate ISDN service, which permits the user to ``send files at 64K bit/sec. ISDN speeds from, say, here to Denver,'' Boyce said. Such a configuration is more cost-justifiable because the same PBX that handles PC data transfers can also be routing voice transmissions between sites over the same ISDN link, he added. Fooling the applications DCP is an existing AT&T interface for linking data equipment over the vendor's PBXs that supports the same 64K bit/sec. channels as ISDN. While DCP uses a different signaling protocol than ISDN, ``applications don't see the difference,'' Boyce said. AT&T sees the PBX-ISDN configuration as complementing rather than replacing LANs or dedicated mainframe-to-workstation links, Boyce said. ``If you're doing distributed database applications, you probably won't use a PBX.'' But by providing PC users with a 64K bit/sec. ISDN link on demand, the PBX offers a cost-effective long-distance connection for sites that cannot cost-justify a dedicated connection or T1 multiplexer, he added. AT&T plans to release the DCP specifications next spring to selected vendors with popular PC communications software packages, Boyce said. 
AT&T hopes, for example, to gain DCP support from Hayes Microcomputer Products, Inc.'s AT modem protocol, IBM Netbios LAN products and Digital Communications Associates, Inc.'s micro-to-mainframe software, Boyce indicated. IBM's apparent withdrawal from the PBX arena may give AT&T more room to sell its DCP concept to IBM installations. By Elisabeth Horwitt, CW staff <<<>>> Title : Netwise to fashion tools Author : CW Staff Source : CW Comm FileName: netwise Date : Jan 9, 1989 Text: Netwise, Inc. in Boulder, Colo., has agreed to provide Novell, Inc. with tools for developing applications for Novell's Netware LAN Server, according to a nonexclusive marketing and development agreement between the two companies. The products, which Novell has agreed to market, will include Netware versions of Netwise's existing tools, which are said to facilitate the development of distributed applications by generating remote procedure calls (RPC). Netwise plans to develop other RPC products for specific Netware-supported environments and protocols. For example, support for Novell's SPX protocol in the DOS and OS/2 environments is scheduled for a first-quarter release. Similar support for Digital Equipment Corp.'s VMS is also planned, the vendor said. RPC essentially supersedes Novell's much ballyhooed value-added processors (VAP). In an interview, Mark Caulkins, vice-president of Novell's software group, conceded that the VAP was difficult to use. The RPC will make it easier to write to Netware, but it still will not be as easy as writing to OS/2, he said. By Patricia Keefe, CW staff <<<>>> Title : Bells can soon send at 15 Author : CW Staff Source : CW Comm FileName: bisdn Date : Jan 9, 1989 Text: LIVINGSTON, N.J. _ Bellcore has announced the successful completion of feasibility testing for a prototype technology that could enable Bell operating companies to provide transmission services at up to 155M bit/sec. 
Switched Multimegabit Data Service (SMDS) is a proposed Bellcore public network-based service designed to extend local-area network and other high-speed local communications links over the local loop, the Bell operating company research and development subsidiary said. If widely deployed by regional carriers, the technology could accelerate the growth of distributed processing applications, intercompany links such as electronic data interchange and LAN-to-LAN bridging, Bellcore said. The organization has already demonstrated an SMDS prototype application for LAN-to-LAN bridging. By providing standardized high-speed switched services, SMDS technology would address the needs of large corporate users who find the current 64K bit/sec. version of ISDN ``vastly inadequate in terms of bandwidth,'' according to Kenneth Phillips, chairman of the Committee of Corporate Telecommunications Users. First services SMDS services are likely to be the first offerings made by the Bell operating companies that support the broadband Integrated Services Digital Network, according to Glenn Estes, a district manager at Bellcore. Broadband ISDN is a still-embryonic version of the telecommunications standard that could support up to 200M bit/sec. ``ISDN typically defines interfaces and protocols'' for a wide range of transmissions, including voice, data and video, Estes said. In contrast, SMDS fully defines a data-only service. In the interim period before broadband ISDN becomes a usable standard, SMDS services are likely to deploy other technologies, such as the IEEE 802.6 standard for metropolitan-area networking, Estes said. By offering such speeds over a switched network shared by multiple customers, the Bell operating companies will save users money compared with sending over dedicated high-speed lines, Bellcore said. SMDS services could be deployed within local public networks within the next two to four years, depending on availability of network equipment, Bellcore said. 
Bellcore would not provide a date for its release of SMDS specifications to vendors for implementation in communications equipment. Nor would it reveal which, if any, of the Bell operating companies are currently contemplating the announcement of SMDS-based services. By Elisabeth Horwitt, CW staff <<<>>> Title : Factories need to shed cu Author : CW Staff Source : CW Comm FileName: amrstudy Date : Jan 9, 1989 Text: High up on the wish lists of manufacturing information systems and plant floor managers are development tools and common application interfaces that minimize the need to write customized code for multivendor connectivity, according to a recent study by Advanced Manufacturing Research, Inc. (AMR). The Cambridge, Mass., research firm interviewed managers from 150 Fortune 500 companies about how they were coping with factory floor communications and what they would like to see in the way of new developments and products. Approximately 23% of respondents had installed some version of a Manufacturing Automation Protocol (MAP) network, AMR said. Most were in pilot sites with eight or fewer nodes, with applications split between intercell and real-time support. A majority of users were still using proprietary networks, citing high costs, fear of pioneering and lack of vendor support as reasons for not going with MAP, AMR said. The study found two distinct computing environments within manufacturing firms. Plant floor managers focused on real-time and process applications and on networking cells. Cells are groups of shop floor devices and controllers that work on the same process or product line. Only 30% of respondents were experimenting with LANs for intracell communications; the rest used some kind of point-to-point proprietary link between devices within a cell. The MIS group focused on linking cell controllers, general-purpose computers and terminals onto a backbone network. 
Materials flow was typically handled by cell controllers communicating with one another as peers and also with a general-purpose host for overall coordination, respondents said. While 77% of the respondents supported the multivendor computing standard Open Systems Interconnection (OSI), more than 60% of the respondents had established a primary host computing platform, typically based on IBM, Hewlett-Packard Co. or Digital Equipment Corp. systems, AMR said. Users look to the vendor The conclusion was that while users looked to OSI for eventual multivendor connectivity, they expected their primary vendor to help them migrate to the standard. IS departments at advanced companies are ``looking beyond communications and . . . seeking a common application environment providing a common user interface, database access and programming interface,'' which would run across a variety of computing and networking systems, AMR said. Such a platform could eventually make networking protocol issues obsolete by allowing the user to access computing resources without worrying about what type of system or network they are on, AMR said. By Elisabeth Horwitt, CW staff <<<>>> Title : A downsizing checklist Author : Robert A. Zawack Source : CW Comm FileName: zawcol Date : Jan 9, 1989 Text: Assemble 35 human resources executives from information systems organizations and you're sure to get a free-flowing discussion about how to handle a variety of challenges. That certainly was the case in a recent symposium that I co-sponsored in which such a group of IS human resources experts shared their experiences and opinions. The common concerns that emerged related to topics such as downsizing, innovative reward systems, mentoring, strategic human resources planning, customer service, performance appraisal, innovation and the question of hiring IS majors or general business majors from colleges. 
In addition, there was general agreement on a string of recommendations relating to one key topic _ downsizing _ with which many participants had recent experience. Their checklist for downsizing in IS includes many items: If you plan to cut 30%, cut at least 35% so you can get the unrest behind the organization and move forward. Always make your first cut deep enough to prevent future unanticipated cuts. A management team probably cannot survive two cuts _ its credibility will be shot. If you are going to cut 35%, make the cuts at all levels, including the direct reports to the IS director. In every IS organization I have been in, a complete level of middle management could have been eliminated with the organization actually becoming more effective. We will see very flat IS organizations in the future. Before you cut, define your core technology people. These are people you must have to move forward after the cut. Consider giving these employees bonuses or else the best among them will leave. Consider downsizing your management ranks as an opportunity to evaluate your managers. Although it is painful, most IS organizations believe the net effect of this action is healthy because it forces decisions that should have been made months or years ago. Use performance tools and supplement them with a rating form that evaluates managers on their competencies and skills. When you are evaluating individual contributors, consider their motivation in addition to their skills, aptitude, experience and past performance. Take care to remember that programmers can have all the skill in the world, but they will still not be capable of producing up to their potential if they are not motivated. Evaluate the cut list for the impact on protected classes of people. Communicate decisions as soon as possible. This must be done face to face by the immediate manager. 
The grapevine will already have alerted people to strange activities in the organization, and people will have a fairly good feel for the downsizing. Remember, research indicates the grapevine is 80% accurate. Have a system established to out-process the people who are cut. If resources permit, provide assistance with placement counseling. Evaluate your management development program. After downsizing, management must send a positive message that this is the team for the future. Then there should be a training program that reinforces that message. By Robert A. Zawacki: Zawacki is president of Zawacki & Associates, an information systems consulting firm in Colorado Springs. <<<>>> Title : Stripping down IS Author : CW Staff Source : CW Comm FileName: playboy Date : Jan 9, 1989 Text: In his company's heyday, Playboy Enterprises, Inc. data processing director John Close was rolling in money to spend on new technology. Close built up a multinational system to run everything from gaming operations in the Caribbean to the company's own credit card transaction system. After 20 years with Playboy, Close is now in charge of tearing down the systems he built. He says he is not sad about the demise of the centralized data center. Instead, he hopes downsizing will provide new business opportunities for the company. ``In the late 1960s and early 1970s, we were trying to find ways to deal with the money. We couldn't make decisions that didn't work. We were trying to stay in the lead with technology, but it wasn't so much of a planning process as a reactive process,'' Close says. ``As the social aspects of the country changed, the magazine had less of an impact on the way the country thought. That's when the decline started.'' After suffering a 50% cut in budget in the last five years, Close decided the only way to help the company survive in the long run was to dismantle his mainframe shop. 
He expects to finish the job and have applications running on minicomputers and personal computers and in service bureaus by spring. The host IBM 4341 system will be gone. Close is undecided about his personal plans after the phaseout is complete. Running an empire Close built the systems to handle all of Playboy's computer needs. There were limousine services, modeling agencies, Playboy Clubs and casinos, resorts, the Playboy cable channel and, of course, Playboy magazine. ``For example,'' Close says, ``for the modeling agency, we provide models throughout the world, with each transaction handled through an IBM 3270-type terminal _ everything from accounting to scheduling, access to external databases, making airline reservations, spreadsheets, development packages _ that kind of thing. Playboy magazine uses the system primarily for accounting and financial services.'' The Boulder, Colo., IS operation serviced international operations as well as corporate offices in Chicago, Los Angeles and New York, each of which has representatives from all of Playboy's businesses. These offices are also serviced in-house by PCs and one mid-range system in Chicago. Close's position in the hierarchy indicates the importance Playboy puts on information systems: He reports to the chief financial officer, who in turn reports to the chief executive officer. For decades, that CEO position was held by Hugh Hefner, whom Close calls ``Hef,'' and was recently turned over to Hefner's daughter Christie, who has been the company's chief operating officer since 1984. The party's over Playboy is run as a business, Close insists. ``At least, that's always been the attempt,'' he adds. Close says that when he started, the company was more into the party scene, ``but in the early 1970s, we began to see that change.'' Since the early 1980s, the company has sold off most of its assets, such as the Playboy Clubs and resorts. 
In cutting costs, MIS cannot be eliminated, but Close says it has been squeezed more than other areas. ``As you're trying to make sink-or-swim decisions for the company, MIS gets to the point where it can get squeezed more than anything else, particularly as those decisions become less dependent on information and more dependent on trying to save the organization.'' At its peak, the MIS department employed 80 people. Now it is down to fewer than half that. Close says that most applications were developed in-house until the early 1980s. He adds, ``Now most of the applications have been acquired from OEM people.'' In the new environment, Close says, there will be minimal development, just typical maintenance operations and interfacing with outside software developers. The host mainframe in Boulder will be eliminated, and the site will be shuttered. ``Large applications, ones that are too large for our proposed direction, will go off-site into a service-bureau-type environment, using the base that we've already built on VM. We're going to draw upon a System/38 Model 700 in the Chicago office to do limited processing _ a lot of the batch work. Our PCs will do the very small operations that we plan to move over,'' Close says. The first office to get the new treatment will be in Los Angeles. At this point, Close is identifying applications that are too large to run in-house. He says, ``We take those things that can be done locally on minis and micros.'' Close says he is limiting applications going to a service bureau to only the ``pieces with heavy CPU demand, which are very intricate in the way they're written.'' Los Angeles will be a pilot site this spring, Close says, because of its cable television production capacity. 
``Since that is an intense database-driven kind of operation, we want to support it in the same manner they've been accustomed to.'' Los Angeles will be a learning experience for Close, and, he says, ``That process can be applied to any new applications'' in other offices. Downsizing will have its costs, Close admits. He hopes to offset much of it by selling equipment, but he does not expect to reap much from the resale value. ``We'll be lucky to break even,'' he says. Close hopes to cut the cost of Playboy's IS by 25%. Despite the preponderance of Playboy magazines mixed in with computer publications in his office and the fact that he gets to call the former CEO by his nickname, Close maintains that Playboy is a business like any other, and IS can still be a key to its continued survival. <<<>>> Title : A strategic plan for the Author : CW Staff Source : CW Comm FileName: train02 Date : Jan 9, 1989 Text: As we approach the 1990s, with computers and communications woven tightly into the organizational fabric and people costs exceeding 85% of most information systems budgets, it is clear that enterprises must introduce another layer of strategic planning. This layer relates to the ongoing professional development of the MIS staff and managers. Two aspects of the planning are essential. First, education and training specialists should understand the growing importance of their roles. They should learn to view themselves and their responsibilities in a different light as agents of change and developers of minds. Training and education traditionally have been viewed as a fifth wheel or, at best, an add-on in all but the most progressive enterprises. Professional development must be viewed as an essential MIS activity. Second, the formulation and annual update of the strategic plan for MIS professional development must be consistent with the strategic plans of both the enterprise and MIS. 
Paul Strassmann, former vice-president of information systems at Xerox Corp., has said, ``You must consider the strategy before you design the structure.'' This observation applies to professional development organizations and to enterprises themselves. In formulating a strategic plan for MIS professional development, one must start with a mission statement for the organization. This statement is a prerequisite for the development of plans at three levels: strategic, tactical and operational. At each level, one must identify issues and objectives before defining the elements of the plan. In this top-down model, the plan elements at the strategic and tactical levels drive formulation at the lower levels. How it might be For example, a strategic issue might be ``user involvement in systems development,'' leading to the strategic objective of defining and encouraging more user involvement. This objective could in turn generate plan elements such as understanding benefits and tradeoffs of user involvement, classifying users with respect to needs and educating users in applications development. On the tactical level, the corresponding issue might be ``to define user involvement that reflects the systems development approach and a user's needs.'' The tactical objective might be to outline procedures that foster greater user involvement. The tactical plan elements then might include defining new user responsibilities, assessing education and training needs and identifying solutions to communication problems between users and MIS. On the operational level, the issue may be ``to develop procedures for new and increased user involvement,'' leading to the objective of formulating tasks, priorities, budgets and resources. The elements of the operational plan might be specifying knowledge and skill requirements, acquiring reference materials, conducting work sessions with user and IS managers and formulating a training budget for users. 
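The top-down model the article walks through, with issues and objectives cascading from the strategic level to the tactical and operational ones, can be sketched as a simple data structure. This is purely an illustration: the level contents are paraphrased from the article's user-involvement example, while the type, field and function names are hypothetical.

```python
# A minimal sketch of the article's three-level planning model.
# Each level pairs an issue with an objective and a list of plan
# elements; the upper levels drive formulation at the levels below.
# All names here are illustrative, not from the article.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PlanLevel:
    name: str
    issue: str
    objective: str
    elements: List[str]
    drives: Optional["PlanLevel"] = None  # next level down, if any

operational = PlanLevel(
    name="operational",
    issue="develop procedures for new and increased user involvement",
    objective="formulate tasks, priorities, budgets and resources",
    elements=[
        "specify knowledge and skill requirements",
        "acquire reference materials",
        "conduct work sessions with user and IS managers",
        "formulate a training budget for users",
    ],
)

tactical = PlanLevel(
    name="tactical",
    issue="define user involvement reflecting the development approach",
    objective="outline procedures that foster greater user involvement",
    elements=[
        "define new user responsibilities",
        "assess education and training needs",
        "identify solutions to user/MIS communication problems",
    ],
    drives=operational,
)

strategic = PlanLevel(
    name="strategic",
    issue="user involvement in systems development",
    objective="define and encourage more user involvement",
    elements=[
        "understand benefits and tradeoffs of user involvement",
        "classify users with respect to needs",
        "educate users in applications development",
    ],
    drives=tactical,
)

def walk(level: PlanLevel) -> List[str]:
    """Return level names top-down, following the 'drives' links."""
    names = []
    current: Optional[PlanLevel] = level
    while current is not None:
        names.append(current.name)
        current = current.drives
    return names
```

Modeling each level with a `drives` link makes the article's point explicit: the strategic plan's elements are inputs to the tactical plan, which in turn drives the operational tasks.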
In forming and updating a professional development plan, managers must understand the intended audience. They also should be involved in the creation of the MIS strategic plan, because people are at least as important to that plan as hardware and software and certainly far more expensive. Indeed, the MIS plan, and ideally the enterprisewide plan, should be regarded as a feedback loop _ that is, human resources should be part of the MIS plan and the MIS plan must feed the professional development strategic plan. A strategic plan for MIS professional development should help ingrain into the organization such relevant education and training issues as cost-benefit analysis, the roles of various media, management development practices and the need to sell professional development to management. A professional development plan should also reflect an understanding of the technical environment as well as management needs. The latter could include the direction and priorities of the organization, competition, budgets, recruitment and the roles of information and MIS. The plan should also reflect the need for interpersonal skills in areas such as instruction of adults, communicating with users and management of change. Organizations must undertake rigorous planning that reflects the growing importance of information systems in corporate and government organizations, the dependence of those information systems on people and the high cost of people. By Lawrence K. Grodman, Special to CW; Grodman is president of QED Information Sciences, Inc. in Wellesley, Mass. <<<>>> Title : Hiring to remain cautious Author : CW Staff Source : CW Comm FileName: careers0 Date : Jan 9, 1989 Text: Barring unforeseen economic developments _ such as a recession prompted by financial dislocations _ MIS professionals can expect to see slow improvement in the career picture during 1989. 
``We expect 1989 to be up a little bit, but not too much,'' says Herb Halbrecht, president of Halbrecht Associates, Inc., a Stamford, Conn., executive search firm. Despite corporate budget tightening, firms will continue to fund MIS as long as top management remains convinced of its value, Halbrecht says. Since the 1987 stock market crash, numerous companies, particularly in financial services, have cut back development to save money, according to Roger O'Connor, a staff consultant at Edward Perlin Associates, Inc., a New York-based compensation consulting firm. O'Connor says he expects cost containment to still dominate in 1989 but has noticed a thaw in the development freeze and an opening up of hiring. ``It'll probably continue to loosen up in the first quarter,'' he says. An executive at Salomon Brothers, Inc. says the brokerage and investment banking firm is hiring at a slow, steady pace. ``But I don't think we'll ever see the incredible hiring of two or three years ago,'' he notes. Government boom Perhaps ironically, one of the brighter hiring spots may be the government sector. Despite intensifying concern about the federal budget deficit and some financial worries at the state and local levels, computer professionals say they see signs of a robust employment picture for MIS people in government. ``One way government agencies tighten up is to automate to boost productivity,'' says Fran Abernathy, president of Abernathy Business Consultants, Inc. in Gaithersburg, Md., which specializes in computer consulting for government. Government computer procurement contracts, such as the recent U.S. Air Force order for minicomputers, show that several large government agencies are actively automating, according to Abernathy. She says that she has also noticed increased work with state and local governments. When consultants are busy, increased MIS employment often follows as employees are hired to implement and operate new systems, Abernathy adds. 
``We usually advise on what skills are going to be needed in-house,'' she notes. Consultants also are busy in the corporate sector, especially in areas in which the labor market is so tight that companies have trouble attracting skilled employees. ``They are turning to consulting firms, and the consulting firms are hiring in droves,'' says Ellen Saunders, director of sales at Computerpeople Consulting Services in Worthington, Ohio. Throughout the Midwest, a growing service sector and a resuscitated manufacturing sector appear to be keeping demand for MIS professionals high, Saunders says. On the East Coast, there is strong demand for programmers and programmer/analysts with one to four years of experience, reports Janice Serinese, marketing director of Automated Concepts, Inc. in New York, another consulting firm. Some openings are created as data centers move out of New York to the suburbs, albeit with a corresponding number of displaced workers. There is a definite falloff in demand at the top of corporate MIS pyramids. ``There is a consolidation going on at the senior level because of all the mergers and acquisitions,'' says Victor Janulaitis, president of Positive Support Review, Inc., a consulting firm in Los Angeles. The corporate restructurings of the past several years show no sign of abating. MIS is sure to suffer in the long run whenever there is a merger or acquisition, Janulaitis says. ``MIS is often a key component in a merger. Basically, you don't need two central MIS operations,'' he says. But despite the consolidations, on the West Coast, Janulaitis says, ``there is a dearth of good talent in the $50,000 to $75,000 salary range.'' These people are the project leaders still three to five years from senior-level positions. In the Southwest, the economy remains sluggish, reports John MacDonald, a partner in the Dallas office of Big Eight accounting and consulting firm Arthur Young. 
The weakness in the oil industry, which has spilled over to banking, finance and real estate, has cut demand for MIS professionals. ``People don't have high expectations for 1989. A lot of people thought a turnaround would come earlier, and it didn't,'' MacDonald says. Skill set The MIS skill areas expected to be most in demand in 1989 are relational databases, telecommunications and networking. Expertise in Unix and personal computer systems, including PC management, will also show strong growth in 1989, observers say. Recruiters and others following the MIS job market expect salaries to remain steady, with small, incremental increases in the neighborhood of 4%, although on the West Coast, Janulaitis says he expects salary increases to average about 7%. Recruiters report a significant reduction in job hopping recently, a development typically associated with a slack pace of hiring. The caution is reinforced by the possibility of unexpected takeovers and restructurings, according to Edward Perlin's O'Connor. ``More people are staying put,'' he says. By Alan Radding, Special to CW; Radding is a Boston-based author specializing in business and technology. <<<>>> Title : GE and GM subsidiary to s Author : CW Staff Source : CW Comm FileName: yrendbit Date : Jan 9, 1989 Text: GE Information Services will be working with General Motors Corp. subsidiary Electronic Data Systems Corp. to set up electronic data interchange (EDI) links between General Motors and its suppliers in Europe. Documents such as invoices and delivery schedules will be exchanged between GM and its suppliers by means of GE Information Services' EDI Express software and services. The GE subsidiary will also provide EDI consulting to EDS. The EDI network will facilitate GM's move to just-in-time manufacturing in Europe, as well as eliminate paperwork, postal delays and human input errors, the companies said. Telematics International, Inc. 
has announced a long-term agreement to supply its Net 25 family of packet-switching systems to NCR Comten, Inc., a wholly owned subsidiary of NCR Corp. NCR plans to market the products worldwide and to use them in its internal data network, NCR Worldwide Internal Network. Tellabs, Inc.'s network management system will form the basis of user configuration and management features within Bellsouth Corp.'s Flexserv family of services, the two organizations said. Tellabs' Telemark system will reportedly allow users to consolidate or reroute dedicated digital circuits within Bellsouth's operating companies' T1 connections. Digital Communications Associates, Inc. has begun shipping FT/Express, a software product said to provide file transfer at rates up to 12 times faster than the vendor's existing Irmalink product. The software runs with DCA's Irma family of micro-to-mainframe boards and is available in either an IBM CMS or TSO version for a fee of $9,000 per CPU, the vendor said. The IEEE 802.3 Committee has established a study group to investigate standards that can be applied to network hub devices such as star hubs, multiport repeaters and wiring concentrators. Co-hosted by vendors Interlan, Inc. and BICC Data Networks, Inc., the group will hold its first meeting in Orlando, Fla., February 1-3. <<<>>> Title : Second annual Dubious Dis Author : CW Staff Source : CW Comm FileName: dubi2 Date : Jan 9, 1989 Text: If history teaches us anything, it is to expect the unexpected _ just ask President-elect George Bush. On his way to the Oval Office, Bush faced off with such unlikely foes as the governor of Massachusetts and a fundamentalist minister. He has named the governor of New Hampshire to run the White House. He met with Mikhail Gorbachev right after the Soviet leader announced unilateral cutbacks in his armed forces. The man who called Ronald Reagan's policies ``voodoo economics'' is now preserving Reagan's economic reforms. 
Despite such unexpected scenarios, which have turned conventional wisdom on its head this year, we continue to expect reality to conform with our plans. The MIS community and vendors are no exceptions. In an effort to once again show how ridiculous real life can be, here are Computerworld's Dubious Distinction Awards for 1988. If it ain't broke, don't fix it! They were the first to get it right, and when they got it wrong, they did it in a big way. ``They'' are American Airlines. The ``on-time machine'' added a software enhancement to its Sabre system that prevented it from doing what it does best: selling lots of airplane seats. Sabre's ``yield-management system,'' enhanced in June, failed to show the correct number of discounted fares available for reservations last summer. The flaw cost the company an estimated $50 million. The software revision that wasn't. Few software revisions have been announced and not delivered as frequently as 1-2-3 Release 3.0. The most recent nondelivery of the most popular spreadsheet package ever was accompanied by a drop in Lotus' stock price. Meanwhile, top dog Jim Manzi was revealed to be the second highest paid executive in the U.S. Part of that pay came from a quick sale of Lotus stock by highly placed company executives right before they announced poor earnings. The SEC checked into the situation but so far has uncovered no wrongdoing. If you'll just excuse me while I retire for a minute . . . Founder and Chairman John J. Cullinane ousted his hand-picked successor and resumed the chairmanship of Cullinet Software after a seven-month retirement. It looks great on paper. Supercomputer wanna-be Evans and Sutherland Computer's President David Evans said in April there was a chance that the company's high-powered processor may not work out. It seems the promising system had been simulated, but not built _ even though the company was predicting shipments by now. 
The glitch stemmed from the inability of chip companies to manufacture the custom silicon simulated by the firm. My kingdom for applications! After promising the delivery of OS/2 applications in the first quarter of 1988, a group of overly ambitious vendors backed down on their commitments, leaving more than 20,000 OS/2 buyers stranded. Rookie goes back to the bench. In July, DEC unveiled Debit/Credit benchmark results showing that its systems outperformed those of IBM and Tandem. At that time, DEC said it would release a full report on the benchmarks in October; it subsequently decided to rerun the tests under an independent auditor before releasing them. Users are still waiting. What Congress legislates, the IRS disintegrates. A well-intentioned congressional effort to clear up the controversy over when firms can hire technical service workers as independent contractors was turned into chaos by the IRS. A paragraph in the Tax Reform Act of 1986, Section 1706, has brought grief to many a contractor. Congress blindly passed the law without knowing how it would affect independent workers. The IRS further stirred debate with ``private letter rulings'' on individual cases. Clarity has yet to come to this situation, and the government is ``studying'' it further. Dealing from more than one DEC. DEC's Database Storage Group signed an ``exclusive'' agreement with Relational Technology under which DEC would resell Relational's Ingres database tools. However, the arrangement did not exclude DEC from also signing a CMP agreement with Cullinet covering Cullinet's tools for RDB. `It's my server!' `No, it's my server!' Ashton-Tate apparently neglected to inform strategic partner Microsoft about an agreement that was intended to allow Novell to distribute a version of the Ashton-Tate/Microsoft/Sybase SQL Server tied into Novell's Netware. 
Microsoft Chairman Bill Gates expressed his profound displeasure at an industry gathering shortly before a press conference that had been scheduled as the centerpiece of Novell's Networld show, causing the press conference _ and the deal _ to be canceled. Still crazy after all these clones. In John Sculley's autobiography, Odyssey, the Apple chief wrote that the firm would be crazy to sue other firms over graphical user interface technology. In March, Apple did just that against Microsoft and Hewlett-Packard. Either I buy you or I sue you. The choice is yours. What's a Dbase cloner to do? Ashton-Tate first tried to buy _ and then filed suit against _ Fox Software for alleged copyright infringement. At the same time, the company rewarded other Dbase cloners _ it bought a mini, RAM-resident Dbase clone from Apex Software and acquired its SQL technology from Wordtech. Would-be cloners are confused. Should they plan to buy a Ferrari with the proceeds of a nice sale to Ashton-Tate or should they save up for legal expenses? One too many strategic matchups? At the announcement of DEC and Ashton-Tate's joint development agreement, Ken Olsen introduced Ashton-Tate's Ed Esber as ``Ed Esbie.'' Too much of a good thing? Creator of Comdex and Interface, The Interface Group figured there was gold to be gathered after the demise of the once-humongous National Computer Conference. The group launched the MIS-oriented World Congress on Computing but killed it months after its debut. The typical audience of two dozen low-level MIS workers and foreign students at WCC sessions was far below the several hundred high-level attendees expected. Coulda seen it coming. Despite some observers' comments about the inevitability of the outcome, too many companies chased too few customers in the once-hot minisupercomputer market, which led to a painful shake-out. 
Problems ranging from slower growth to losses to layoffs reigned at Alliant, Elxsi, Multiflow and a host of other organizations; only Convex came through fairly unscathed. Perhaps most telling was the withdrawal of traditional minicomputer power Prime from its equity stake in minisuper start-up Cydrome. Perestroika for profit. The bad news for Saxpy Computer was that one of its employees was charged with giving supercomputer technology secrets to the Soviets. The good news was that the publicity generated by the arrest gave the Silicon Valley start-up more instant name recognition than Regis McKenna could have produced in a year. Do as I say, not as I do. IBM cut itself a healthy slice of humble pie in mid-September when it tacitly conceded to have underestimated the number of users who have no burning desire to scrap DOS and follow Big Blue's parade over to the Micro Channel Architecture. Well, what do you know _ we're No. 1 again! DBMS and hardware vendors somehow exhibited an uncanny ability this year to benchmark their products against each other _ and all come out on top. Oracle, DEC, IBM, Cullinet and Tandem all scored the highest marks with their own versions of Debit/Credit. Bet these folks play a mean game of solitaire. <<<>>> Title : Clearing the obstacles to Author : Elisabeth Horwit Source : CW Comm FileName: cimcast Date : Jan 9, 1989 Text: Computer-integrated manufacturing (CIM) is finally escaping from vendor brochures and ``factory-of-the-future'' prototypes into the business mainstream. This trend, which began last year and should gain momentum in this one, derives from two related developments. One is the emergence of communications standards and computing platforms that make it much more feasible for manufacturing companies to coordinate information flow among islands of automation. The other is the increasing number of users who have shed their skepticism and become believers in the feasibility and ultimate payoffs of CIM. 
Users' increasingly positive attitude toward this technology was reflected in an Autofact '88 attendee's definition of the discipline: ``Cost-cutting on units, doing away with human error, cutting setup and lead time and cutting inventory.'' Half-empty glass Two or three years ago, many manufacturing information systems managers focused on the difficulties rather than potential payoffs of CIM _ chiefly, the task of linking together a multivendor melange of factory equipment, hosts and controllers. This was understandable because, at that time, the job was a formidable one that daunted all but a few giants such as General Motors Corp. and Boeing Computer Services. Few tools existed for integrating manufacturers' multivendor equipment and software installations, which meant that the communications foundation for CIM had to be developed from scratch _ and at great expense. According to Darryl Cain, a supervisor at Deere & Co.'s Harvester Works, CIM implementation is still ``a long road to uncertain goals, and the question management asks is, `Will CIM ever be complete?' '' But now, more and more companies _ including Deere _ have made CIM an intrinsic part of their competitive strategies. And they are much more confident that the implementation can be done at reasonable cost. A major reason for that confidence is the maturing of communications standards such as Manufacturing Automation Protocol (MAP). Until mid-1988, industry players still seemed to assume that manufacturers needed to settle on just one set of protocols. Debate on the comparative merits of Ethernet and MAP's Token-Bus protocol on the factory floor was stormy, with Digital Equipment Corp.'s combative chief Ken Olsen keeping the pot boiling with his frequent MAP attacks. Another fierce argument raged over the question of whether, or even when, users should migrate from their existing Transmission Control Protocol/Internet Protocol (TCP/IP) installations to MAP. 
This year, with the long-awaited arrival of Version 3.0 last spring, MAP has achieved some degree of stability and vendor support. However, it is unlikely to be adopted universally because many users solved their multivendor connectivity problems years ago with older protocols such as TCP/IP, Decnet and Ethernet. These users are reluctant to throw out their existing networks just to become MAP-compatible, according to a recent user survey by Advanced Manufacturing Research, Inc. (AMR), a Cambridge, Mass.-based research firm. But now, most major vendors _ DEC included _ are promising support for MAP, TCP/IP and Open Systems Interconnect pretty much as a matter of course, and the choice of standard is becoming ``somewhat of a nonissue'' for CIM implementors, according to AMR President Anthony Friscia. Many users have discovered that they can mix and match various networking protocols. As Tim Dirr, a computer-aided design (CAD) applications engineer at 3M Co., put it, ``It would be great to have just one networking standard, but I'll settle for two or three.'' But this level of integration is the easy part of CIM. As the new year comes in, the main concern of many implementors is how to develop applications to manage information flow between different work areas across various hardware and software systems. This is necessary so that, for example, design change orders go to the business host for approval, out over an electronic data interchange link to suppliers and down to the cell controllers for implementation into the manufacturing process. Vendors are just starting to come up with CIM solutions on this higher, more complex level. First, they are aggressively entering alliances with other hardware vendors, particularly controller vendors, according to AMR Vice-President Bruce Richardson. ``The goal is not cell control but cell integration,'' he says. 
``If we're building vitamins or cars, we don't want to walk down to the factory floor with the ordering slip.'' Mainframe scheduling systems need to be linked to area controllers that coordinate production on the factory floor ``so no order is done until the day it's due,'' Richardson says. Last fall, DEC and Allen-Bradley Co. announced their Pyramid Integrator. The device comprises an Allen-Bradley cell controller, which provides links to shop floor devices, and a DEC Microvax processor module, which provides integration with plant and area management applications that typically run on DEC systems. Another recent alliance for integration was formed between Motorola Computer X, Inc. _ a Motorola, Inc. subsidiary _ and Stratus Computer, Inc. Computer X offers a hardware platform for coordinating the different cells on a factory floor, each of which can be working on a different part of a product or process. Keeping track Stratus sells its fault-tolerant hosts as area controllers that can coordinate activities across the plant; it also provides links to design and engineering workstations and IBM hosts that often run administrative and scheduling applications such as manufacturing resource planning. Computer vendors are also reaching out to the CAD/computer-aided manufacturing side by supporting manufacturing's dominant communications protocols. Both IBM and DEC, for example, have announced support for Sun Microsystems, Inc.'s Network File System and Apollo Computer, Inc.'s Network Computing System. Apollo and Sun have reciprocated by supporting Decnet and Systems Network Architecture. As a result, Sun engineering workstations can send design documents directly to IBM scheduling and bill of materials systems. But customers want more than just a physical link in order for graphics documents to be sent to the shop floor in a useful form, according to Donald Bell-Irving, a manager of DEC's CIM applications marketing group. 
``You need low-cost graphics devices on the shop floor and a fast LAN; but even more, you need a compound document technology'' that allows users to append routing slips, comments and changes to a design as it moves from one manufacturing area to the next, he explains. ``That technology is just becoming feasible.'' The big computer companies' other major marketing strategy is the CIM software platform _ a common user interface, communications and database environment that runs on their favorite hardware platforms. They are also working with software companies to produce applications and development tools for their platforms. The software platforms address what Anthony Klemmer, a vice-president at ITP Boston Inc., a Cambridge, Mass., consulting company, refers to as a ``technology backlash'' against customized CIM applications that were too costly to be practical. ``If you can integrate and automate in pieces [using] a platform that allows you to reuse software, standardize additions and thus lower costs, it is well worth it,'' he says. Several users, however, say that this latest wave of ``open'' software interfaces is too recent for them to judge its usefulness to their CIM plans. Some worry that if they choose one vendor's software platform, they will not be able to install those devices that do not support it. What to do The solution to this problem is for the host vendors to migrate their software platforms to industry standards such as MAP, according to James Caie, director of manufacturing systems at GM. Caie was in charge of CIM at the famous Saginaw plant. This migration would be a benefit to CIM implementors, since the platforms would provide something that MAP currently lacks: a standardized foundation for application development.
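The cross-system information flow that implementors are wrestling with _ a design change order approved on the business host, sent to suppliers over EDI and dispatched to the cell controllers _ can be sketched as a simple staged pipeline. Every class and stage name below is invented for illustration; no vendor's CIM platform works exactly this way.

```python
# Hypothetical sketch of the change-order flow the article describes:
# business-host approval, then EDI notification to suppliers, then
# dispatch to shop-floor cell controllers. Names are illustrative only.

from dataclasses import dataclass, field

@dataclass
class ChangeOrder:
    part: str
    revision: str
    log: list = field(default_factory=list)

def business_host_approve(order: ChangeOrder) -> ChangeOrder:
    order.log.append("approved by business host")
    return order

def edi_notify_suppliers(order: ChangeOrder) -> ChangeOrder:
    order.log.append("suppliers notified via EDI")
    return order

def dispatch_to_cells(order: ChangeOrder) -> ChangeOrder:
    order.log.append("sent to cell controllers")
    return order

order = ChangeOrder(part="gear housing", revision="B")
for stage in (business_host_approve, edi_notify_suppliers, dispatch_to_cells):
    order = stage(order)

print(order.log)
# ['approved by business host', 'suppliers notified via EDI', 'sent to cell controllers']
```

The point of the sketch is the vendors' problem in miniature: each stage lives on a different machine from a different supplier, and the hard part is not any one stage but a standard way to hand the order from one to the next.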
Rather than wait for vendors to come out with more complete CIM standards and better development tools, many manufacturers are going ahead with CIM implementations _ with development plans carefully geared to take advantage of what is out there now, both on vendors' shelves and within their own installations. Savvy IS managers are also tying their CIM plans to specific business goals, rather than taking a wholesale approach to the technology. For example, rocketing interest rates and a depressed economy in the early 1980s drove Deere to find ways to cut inventory and production costs via CIM, according to David Scott, manager of marketing for Deere Tech Services. ``When interest rates are 22.5%, inventory starts eating up a lot of cash.'' The manufacturer implemented a top-down plan with a bottom-up implementation that ``made use of existing equipment and didn't go overboard putting technology in for technology's sake,'' Scott emphasized. The payback: Deere just announced record earnings for 1988, and its factories can continue to turn a profit even running at 30% to 35% capacity. Deere was among the pioneers, but a number of other companies are turning to CIM. Barring catastrophes in the manufacturing sector and the U.S. economy in general, we should see a snowball effect in the CIM world in the next few years. As the new wave of CIM implementors report the paybacks they have gleaned, yet more companies will be encouraged to start their own installations. This, in turn, will cause vendors to accelerate their efforts to provide better products and tools to gain greater shares of the burgeoning CIM market. While the term ``computer-integrated manufacturing'' was coined only a few years ago, the networking, software and computing technologies that serve as CIM's foundation have been developing for decades. This should reassure IS managers who do not like to bet their budgets on untested products. 
While the leading edge of CIM technology may be vaporware, there is now a solid body of available products to get implementors going. It's time to take the plunge. By Elisabeth Horwitt; Horwitt is a Computerworld senior editor, networking. <<<>>> Title : Flopticals: New PC backup Author : CW Staff Source : CW Comm FileName: insit1 Date : Jan 9, 1989 Text: SANTA CLARA, Calif. _ A flexible disk drive with a sprinkling of optical technology may one day replace tape drives as the backup method of choice for personal computers. With its relatively low expected retail price, the drive may eventually provide some PCs that do not have backup with an extra measure of protection. Insite Peripherals, a start-up company located here, has created what Insite President and co-founder Jim Adkisson calls a ``floptical'' disk drive. To achieve higher track density, Insite inscribes a standard 3½-in. flexible disk with an optical servo track _ a strip of optically recorded data. The closed-loop optical servo guides the drive's read/write head so that it follows the magnetic data tracks more precisely than it does with standard floppy disks. The drive's read/write carriage is equipped with an LED and magnetic recording head. The LED reads the track while the magnetic head reads the data. As a result, more data can be squeezed onto the Insite drive than on standard floppies, Adkisson said. The Insite Model 1325 floptical disk drive boasts a track density of 1,250 tracks per inch, compared with 48 and 135 tracks per inch for traditional floppy drives. More tracks, more bites While a standard high-density 3½-in. floppy drive offers at best 1.44M bytes of storage capacity, the Insite drive offers 20.8M bytes, he added. It may be quite some time before users see any Insite drives installed in a PC. Production will reportedly begin in the spring, and PC vendors will most likely wait before gambling on using such a product. The most likely source will be computer resellers.
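The capacity claim holds up to a little arithmetic. The figures below come from the article; the split between track-density and linear-density gains is an illustrative back-of-the-envelope calculation, not Insite's published math.

```python
# Sanity check of the floptical capacity figures quoted in the article.
# The decomposition into track-density and linear-density gains is an
# illustrative assumption.

STANDARD_TPI = 135      # tracks per inch, high-density 3.5-in. floppy
FLOPTICAL_TPI = 1250    # tracks per inch, Insite Model 1325
STANDARD_MB = 1.44      # capacity of a standard high-density floppy
FLOPTICAL_MB = 20.8     # capacity claimed for the Insite drive

track_gain = FLOPTICAL_TPI / STANDARD_TPI   # ~9.3x from tighter track pitch
total_gain = FLOPTICAL_MB / STANDARD_MB     # ~14.4x overall
linear_gain = total_gain / track_gain       # remainder: ~1.6x more bits per track

print(f"track-density gain:  {track_gain:.1f}x")
print(f"total capacity gain: {total_gain:.1f}x")
print(f"implied linear gain: {linear_gain:.1f}x")
```

By this split, the optical servo's tighter track pitch does most of the work; the remaining gain of roughly 1.6 times would come from recording bits more densely along each track.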
The drive should sell to end users for less than $500, Adkisson said. A 40M-byte data cartridge drive sells for about $300. However, Insite is still ``cleaning up some technical issues,'' Adkisson noted. He declined to offer specifics. Production is scheduled to begin in April, and the drive will be sold to OEMs and value-added resellers. Although the technology may represent a breakthrough, it will be relatively inexpensive because it is based on standard media, Adkisson said. The drives will be sold to OEMs at a price of $250 in quantities of more than 5,000. Adkisson said he expects initial orders to come from distributors and resellers in the add-in business. System vendors require that there be multiple sources of a product before committing to implement it, he maintained. Insite is offering to license the technology to other drive makers. Adkisson said Insite is in serious negotiations with a number of parties. However, he declined to name them. Eastman Kodak Co. subsidiary Verbatim Corp. as well as Xidex Corp. have signed on to produce media, an important factor if other peripherals makers are to consider manufacturing the Insite drive. Robert Katzive, a vice-president at market research firm Disk/Trend, Inc., said the new drive's sluggish performance will relegate it to a backup role. However, users may prefer Insite to a tape drive because it integrates well with smaller systems. Like traditional drives, the disk does not require specialized software to manipulate files. ``Most people prefer disk technology because it's faster and much more familiar to them,'' he said. ``With a tape, you have to run it all the way to the end to get to a certain file, and you need specialized software.'' Insite's floptical disk drive offers an average access time of 65 msec, far slower than many rigid drives. The Insite drive will fill a gap, offering backup for the smaller 3½-in. rigid drives, Katzive noted.
By Julie Pitta, CW staff <<<>>> Title : Budget process a toss-up Author : CW Staff Source : CW Comm FileName: misbudge Date : Jan 9, 1989 Text: Funny thing about the future. It's never how you expected it to look once it arrives. Nevertheless, MIS managers face the task of gazing into their crystal balls as they shore up their budget defenses for the next few years. The managers must consider an increasing number of external forces capable of generating economic turmoil. Some managers are adding to their budgets while others are trimming, depending on top management's perception of the vitality of the specific industry. Still others, like Jeffery Alperin, assistant vice-president for corporate technology planning at Aetna Life and Casualty Co., contend that such strategies are reactive and therefore viewed at Aetna as imprudent business practices. The trend at Aetna, Alperin says, is to cut costs without regard to forecasts. The early portion of next year promises smooth sailing, according to analysts, as the incoming George Bush administration is expected to enjoy the economic honeymoon traditionally granted to new presidents. Unfortunately, things begin to look a little cloudy after that. The most significant wild card promises to be international trade. ``Any position taken by the U.S. that is perceived as protectionist _ whether against Japan or Europe _ will be a very serious problem for the industry,'' says David H. Mason, vice-president of Boston-based Northeast Consulting Resources, Inc. Protectionist moves by Congress could result in a foreign-vendor backlash against U.S. companies that do a large overseas business. The result is higher expenses that are, in turn, handed down to the end user. Although Japan is viewed as this country's chief foreign economic threat, Europe will take on increasing importance as the future approaches. By 1992, the U.S. could be facing nothing less than the economic unification of Europe.
Essentially, there is a movement that would unite all European countries into a single financial market. Tariffs would be nonexistent, and a single European passport would prevail. This unity could give European vendors an advantage in price wars with U.S. companies. ``The goal is to let a Siemens or an Olivetti become real world-scale competitors because they'll have the strength of a single European market to build on,'' Mason says. ``The argument is that Siemens can't compete with an IBM because West Germany is too small a market. But if you unify Europe, you can create a handful of very big European competitors.'' The Japanese still cannot be counted out either and are increasingly challenging U.S. businesses in core industries. The Council on Competitiveness reports that by 1986 the Japanese had captured 65% of the world's semiconductor market, while the U.S. had less than 30%. The council also sees storm clouds on the horizon: Of all the major industrialized nations, the U.S. spends the smallest percentage of its nondefense budget on research and development directly related to industrial growth. Perennial problems like the budget deficit also continue to nip at the heels of the U.S. economy. ``Just how willing are foreign investors going to be to continue to pump money into our economy while we continue to run deficits?'' asks Jack Biddle, president of the Washington-based Computer and Communications Industry Association. Some analysts say these factors could put the U.S. economy in deep trouble. ``Ultimately, whoever is the next president will have to deal with a recession,'' Mason maintains. ``In some industries, we're seeing 40% of capital spending going toward information systems. I find it hard to believe that rate is sustainable, and at some point it has to back off. A recession could do that very quickly.'' Tightening the belt Such monetary uncertainties will require MIS managers to keep a tight ship, financially.
A recent white paper from The Yankee Group in Boston claims that the current rate of technological change could lead to a 100% turnover in installed systems, meaning the average system could have a life span of just 18 months by 1990. The streamlining that comes of this is also going to result in managerial changes, including a tighter connection between the chief MIS officer and the company chairman and a rethinking of the interaction between man and machine. ``We're competing for a shrinking number of qualified employees,'' Biddle says. ``They don't want to do the dirty work anymore, and automation is the answer. But not automation in the past sense, in terms of simply replacing accountants and billing clerks, but using automation as a technique to tighten up the entire organization.'' Fights over intellectual property rights also promise to shake up the MIS office. ``The fact that you can secure rights to the look and feel of a program for the length of a copyright _ which is a hell of a long time _ can get the whole of society into trouble,'' says Martin Ernst, an associate at the Center For Information Policy Research at Harvard University in Cambridge, Mass. The factors that shape today's economic conditions doubtless influence information service budgets in numerous ways. Nevertheless, almost all large data centers are beset with what may seem _ but are not necessarily _ two opposing objectives: the need to mount costly strategic projects that will keep a corporation ahead of the competition and adherence to tight-fisted mandates on spending. Aetna believes that a tighter budget leaves more dollars for strategic platforms. Alperin's department, for example, will function with a budget trimmed by 5% in 1989.
``We will do more work than last year,'' he concedes, ``but we are much more productive and are finding ways to do even more with less.'' Unlike many companies that are just now facing the fact that the days of 10% MIS budget increases are over, Aetna's Alperin says, ``as a whole we have been holding expenses pretty flat over the years and reserving any increases for very specific strategic areas.'' In contrast, economic forecasts for 1989 coupled with the sting still felt from last year's stock market fiasco have flattened the MIS budget at Brunswick Corp. in Skokie, Ill., a large manufacturer of leisure products, according to MIS director Brian Ellis. The watchword at the company is caution _ and that caution spreads to strategic planning methodology, old and new applications, innovations and personnel. The company now has a formal three-year strategic planning curve _ the direct result of the October 1987 crash and economic forecasts, Ellis says. The current MIS budget, which has been cut by an unspecified amount, now spans from 1989 through 1991. Information services will slow down innovations, concentrating instead on stabilizing existing services to make sure they meet customer needs, with additional efforts to diminish customer complaints. In addition, MIS has felt the impact by having to do without some state-of-the-art equipment, Ellis says. Ellis says the strategy will help the organization remain profitable if the U.S. economy remains stable. The reduced budget will also likely spare the company from having to cut back in the face of economic downturn. Times may be tight at Brunswick, but at American Industries, Inc., data processing manager Larry Potter says management ``anticipates a rosy 1989, and its budgeting process will reflect it.'' The 1989 budget will post about a 5% increase, he says, slightly higher than last year's increase, reversing a downward trend evident since 1982. 
Budgeting within the organization follows steel industry trends and looks over the shoulder of its subsidiary, American Steel Division, the beneficiary of most of the MIS services. According to Potter, the steel industry does not necessarily follow the rest of the economy. ``Although management projects a cool national economy for 1989,'' Potter says, ``it anticipates a good year in steel, helped by both foreign importers and the voluntary trade agreement.'' American Industries plans for two years and budgets for the coming year. Those budgets, reviewed quarterly and matched against quarterly forecasts and division results, enable MIS to react quickly, he says. With a good year projected by management, MIS will likely add one staff member and concentrate on a growing application load and on meeting increasing user demands. In addition, Potter hopes to complete a few projects that have been on the drawing board. If, however, the economies of the steel industry turn sour, Potter will first cut into staff perqs and conference attendance. The second round would trim services and the third, equipment expenses. ``We can judge when to react by measuring the cycles of our business against anticipations for the coming two quarters,'' he explains. Like American Industries, Scott Paper Co. would battle a disastrous economic year by trimming conference and training expenditures. But MIS director Ronald Renk does not expect to have to resort to anything of the sort. Renk says that in 1989, Scott MIS will implement a few strategic projects. Costs for these will represent one increment to the increased budget, which is based on current performance. The other increment, he says, will account for inflation. At Express Freight Line, Inc., data processing manager David Schmalz has yet to post his 1989 budget, but he anticipates that MIS, like other departments, will have to cut back.
Schmalz does not anticipate a recession in 1989 but says the tight budget is a reality in an industry fraught with competition. ``Freight discounts are way out of hand _ between 50% and 60%,'' he says. ``The only way to make it up is to cut expenses. It comes down to MIS and other departments.'' As a result, he says, development projects ``have been slowed, but not stopped, because of staffing shortages and a shortage of tools to speed the development.'' At Radnor Corp., a wholly owned commercial real estate subsidiary of Sun Corp. that generates about $30 million annually for the parent company, the MIS budget will triple in 1989 _ from a meager $100,000 to approximately $300,000. The budget, according to Alan Alesius, manager of data processing, will increase again in 1990 but not anywhere near the expected 1989 jump. Economic forecasts at the company are keenly scrutinized because of the close tie between economic conditions and the purchase of homes as well as business real estate. ``If the current expansion continues, Radnor should experience a year of decent _ but not explosive _ growth,'' Alesius says. ``If there is an economic downturn and interest rates go up, people will sit tight. It will be a tougher time.'' But for the MIS manager, ironically enough, a bad year could be a blessing. Recent companywide growth has diverted the attention of key personnel, leaving little time to attend to standardization and policy issues. ``Our projects portfolio has grown so large that few people have had time, for example, to contribute to getting remote locations standardized,'' Alesius explains. ``1988 was a good year,'' he adds. ``We are looking to add value to the corporation in 1989.
It is a good opportunity for MIS.'' At Echlin, Inc., a completely decentralized company, some divisions are posting budgets that are over last year's by between 5% and 15%, while others are staying flat, according to Richard Hock, MIS director at corporate headquarters. ``We're running like we have always run in the past _ carefully. But we have not attempted to curb expenses,'' he says. While some divisions have added to their budgets, the MIS budget at headquarters, which Hock oversees, has been dropping for the last seven years. This year, Hock has trimmed the budget by $350,000 _ the result of a recent conversion from a mainframe- to a micro-based platform. Hock noted that his practices are not indicative of the other units, although some are leaning to minicomputers and microcomputers. Hock said he does not anticipate a ``deadly economic year'' but that if he were forced to trim expenses, he would first freeze capital expenditures and then trim staff. But he said he does not expect to have to resort to such measures. Although Echlin is not recession-proof, he points out, people who do not replace cars do more repair work. ``Our 1989 runs started in August, so we are [now] in 1989 and having a good year,'' he says. By James Daly and Robert Moran; Daly is a Computerworld staff writer. Moran is Computerworld's Mid-Atlantic correspondent. <<<>>> Title : Computers are changing th Author : CW Staff Source : CW Comm FileName: terryibm Date : Jan 9, 1989 Text: Computers are changing the way both individuals and organizations work. By providing timely access to relevant data, computers allow us to spend less time researching and verifying information and more time getting things done. This makes for faster, more informed decision making, and it improves our effectiveness and productivity. Today's computer systems speed memos, documents, images and graphs to individuals throughout an organization.
This kind of direct people-to-people communication results in more focused and efficient dissemination of information. New technologies are bringing more capability to computers, allowing users to manipulate, understand and analyze data in ways they never could before. With access to company databases, decision support systems, expert systems and other technologies, information workers ranging from engineers to scientists to loan officers can work even more effectively. Computer imaging is another dramatic example of how new information technologies affect work. By converting paper-based customer information to electronic images, businesses can realize millions of dollars in savings and significant improvements in customer service. As organizations become more competitive, the management of information as a strategic resource becomes paramount. A major challenge in the '90s will be the development of software to support the computers that help people to better manage the flow of information. With advanced graphics, touch screens, voice and image recognition running on high-performance hardware and software, computers will continue to make ``change'' the watchword of the work environment. <<<>>> Title : Calling the shots for '89 Author : Mark Breibart Source : CW Comm FileName: mbside2 Date : Jan 9, 1989 Text: It will take a hardy MIS manager to weather the business and technology trends of 1989. Here's a little help from leading forecasters on what to expect in key areas: Management Strategies ``MIS executives have to get their house in order. For example, they have to get rid of the multiplicity of languages and architectures so they can move more quickly on new technologies. And they have to institute more realistic chargeback systems. They have to hire people with business savvy into MIS positions, even if [these people] are not technicians.
If they don't, we're going to blunder into the '90s with the wrong people in the jobs.'' Ted Freiser, president John Diebold Group, New York ``Companies in the '90s will have to understand how to integrate technology into the business. They have to stop viewing MIS as a cost and start utilizing it to make their companies more competitive.'' Jim Hall, principal Index Group, Cambridge, Mass. Large Systems Companies will be changing to or considering a two-tier environment rather than a three-tier strategy over the next 18 to 36 months. The architecture will have a layer of micros talking to host IBM 3090-class machines, with servers in the middle based on machines using the 80386 and 80486 chips. DEC won't be able to compete because it has no viable high-end or PC solution. Robert Tasker, vice-president International Data Corp. (IDC), Framingham, Mass. ``The on-line processing systems of the '80s will give way to ad hoc computing in the '90s, with a lot of canned intelligence in the low-end systems.'' MIS directors will have to get on top of the trend to give the user more control, instead of fighting it. Curt Beaumont, director Systems and Peripherals Technology Service, IDC Mid-range Systems Next year is not going to be pretty. Particularly at the low end, PC LANs _ not minis _ will be the preferred choice. That approach will also hold true for larger systems when SQL server software becomes available for distributed processing. DEC's low-end VAXs could soon be in trouble. Even for larger systems, users will eventually throw out their VMS applications and go to third-party programs on the PC. The VAX will become a network server, but only if DEC moves to a standard operating system like Unix and if the VAX is optimized as a server instead of as a time-sharing machine. ``For IBM shops, the focus of SAA is wrong. What's being touted is the ability to push applications onto bigger machines. 
But people want to move applications to smaller machines, down to their control.'' John McCarthy, director of professional systems research Forrester Research, Inc., Cambridge, Mass. DEC will try to preserve its proprietary systems. That should be easy for them, at least in the short run. Too many people are happy with VMS. IBM's AS/400 shows there is still a lot of growth for multiuser systems, some of it as network servers in large firms, but much of it in the small-business market. Other minicomputer companies will either emphasize nonproprietary systems like Unix or focus on niche markets like Wang with image processing or Prime with CAD/CAM. Brian Daly, senior associate editor Datapro Research Corp., Delran, N.J. Communications ``Growth in the LAN market will continue to be impressive, with token-ring networks increasing even faster than Ethernet ones. ``There will be a gigantic leap in the use of high-performance and superperformance networks with speeds from 100M bit/sec. on up.'' Brad Baldwin, industry analyst Dataquest, Inc., San Jose, Calif. Software ``There will be more talk than reality about distributed database systems. The technology is here, but users won't be ready for it until they change their view of the proper architecture. For example, the classical database notion is to reduce data redundancy. But as the rules of end-user access have shifted, and as we have LANs with 450 users on them, we cannot have one database that serves all purposes.'' William Inmon, senior principal American Management Systems, Inc. Lakewood, Colo. Nineteen eighty-nine will be the year of the distributed database. The software will center around SQL and PCs based on the 80386 chip. If these products deliver as much as promised _ and it's a big ``if'' _ they will provide an alternative to minis and mainframes. Standards will be a bigger topic than they have been at any other time this decade. George Schussel, president Digital Consulting, Inc., Andover, Mass. 
By Mark Breibart; Breibart is a Flugelman Intern working with Computerworld Focus on Integration. <<<>>> Title : Sometimes Washington seem Author : Mitch Betts Source : CW Comm FileName: govt2 Date : Jan 9, 1989 Text: Sometimes Washington seems a little like a TV game show _ the categories stay the same, but the players keep changing. In the coming year, the categories for the federal policy game will sound quite familiar: taxes, trade and telecommunications, the three Ts. But now the computer community gets to deal with a new set of players, namely the incoming administration of President-elect George Bush. ``Apparently it is going to be less of a policy transition than a personnel transition,'' says Olga Grkavac, senior vice-president of government relations at ADAPSO, the computer software and services industry association. For example, the changeover may trigger the appointment of a new U.S. trade representative, a new assistant attorney general for antitrust, a new secretary of commerce and new members of the Federal Communications Commission. The Bush team is likely to be a mix of holdovers and new appointees, all with strong Republican credentials, Grkavac says, although the infusion of new blood may bring some subtle changes in policy and emphasis. Computer industry lobbyists point out that the new political appointees must be ``educated'' about the needs of high-technology industries. For example, the regional Bell holding companies will need to develop ties with a new batch of administration officials in their high-powered effort to win freedom from business restrictions set down in the 1984 AT&T divestiture decree, which keeps them out of the long-distance telecommunications, electronic publishing and manufacturing businesses. Lose-lose situation The Bell companies would have been losers whether Bush or Michael Dukakis had won the election, because they are losing valuable allies in the outgoing Reagan administration, according to a report by George R. 
Dellinger, a telecommunications analyst at Washington Analysis Corp., a securities research firm in Washington, D.C. Consequently, most of the high-tech trade associations are busy sending position papers to Bush's presidential transition team to influence the movers and shakers in the forthcoming administration. Among the filers are the American Electronics Association, the Computer and Business Equipment Manufacturers Association (CBEMA), the Council on Competitiveness, the Computer & Communications Industry Association and the Council on Research and Technology (Coretech). Coretech and the Council on Competitiveness, for example, are urging the new administration to appoint a presidential advisor on science and technology and improve the coordination of federal technology policies. W. J. Sanders III, chairman and chief executive officer of Advanced Micro Devices, Inc., a Sunnyvale, Calif.-based semiconductor company, went so far as to publish a ``Dear George'' letter as a full-page newspaper advertisement. It appears there will be no shortage of advice on how the Bush team should handle the three Ts. Taxes. One area of concern for corporate America _ including MIS managers and the computer industry _ is whether a big tax hike to close the federal government's $130 billion budget deficit is in the offing. Higher corporate taxes could mean less money to spend on technology, a point that was underscored when the computer industry howled in protest at a proposal by presidential candidate Jesse Jackson to raise corporate taxes by $20 billion. That really got the industry's attention. ``You can't say that technology is important for competitiveness and then just take all of the money you're supposed to invest,'' a CBEMA spokeswoman said at the time. However, Bush remains adamantly opposed to big tax hikes, and Congress will find a way to sidestep the issue in 1989, analysts say. Instead of a large tax increase, L.
Douglas Lee, an economist at Washington Analysis Corp., predicts that Congress and the Bush administration will hammer out a modest, $25 billion budget package of spending cuts and revenue boosters. ``We believe that constructing a $25 billion deficit-reduction package for the 1990 budget is a very manageable task,'' Lee says. In the past, Congress has regularly produced similar packages composed of some spending restraint, some modest increases in excise taxes and optimistic economic assumptions. Whatever tax bill emerges, the high-tech industry will fight hard to add a permanent extension of the 20% tax credit for research and development expenditures, according to John L. Pickitt, president of CBEMA. In 1988, Congress approved a one-year extension of the R&D credit, now scheduled to expire at the end of 1989. The good news is that Bush has publicly stated his support for a permanent R&D credit; the tough part is finding a way to pay for the popular tax credit, which costs the U.S. Treasury about $705 million a year in foregone revenues. Bush also supports cutting the capital gains tax rate to 15% on investments held for at least one year _ a proposal supported by the National Venture Capital Association and the small firms in the American Electronics Association. Supporters say the tax break provides an incentive to venture capitalists to invest in new and growing high-tech firms. However, this proposal faces tough opposition from Congress. Trade. The computer industry can expect the Bush administration's trade policy to look a lot like the Reagan administration's. During the election campaign, Bush seemed satisfied with the Reagan trade record and demonstrated the same free-trade instincts. But Stephen D. Cohen, an international trade analyst at Washington Analysis Corp., notes that the Reagan-Bush team actually followed a ``pragmatic'' trade policy rather than a pure free-market policy. 
That is, it resisted trade barriers only when political pressures were not severe. Cohen suggests that the Bush administration will be forced to have a slightly tougher trade stance, since it will be faced with a continuing massive trade deficit and the 1988 omnibus trade law, which mandates a more aggressive government effort to reduce or eliminate foreign barriers to U.S. exports. In fact, the Bush administration may spend most of 1989 just implementing the 1988 legislation, ADAPSO's Grkavac says. According to a report by the U.S. Department of Commerce, the Omnibus Trade and Competitiveness Act of 1988 contains the following provisions of interest to U.S. high-tech firms seeking to boost exports: It gives the president authority to negotiate a multilateral trade agreement to open up foreign markets to U.S. goods and services for the next 20 years. It sharply reduces the burdens of national security export controls. The law focuses on restricting the export of truly strategic technologies to the Soviet bloc while relaxing controls on trade in the free world. It permits the U.S. Trade Representative to initiate trade complaints against countries that fail to provide adequate intellectual property protection for U.S. products. It also requires the U.S. Trade Representative to identify countries that have trade barriers affecting the telecommunications industry and requires negotiations to eliminate or reduce the barriers. If the talks fail, the president is required to take some retaliatory action. Telecommunications. Congress has already set the stage for the two big communications battles of 1989. One controversial issue is the FCC's proposal to overhaul the way it regulates AT&T's long-distance rates. The proposal would replace the current system of profit ceilings with a new scheme of so-called ``price caps,'' which place an annual ceiling on the price of long-distance services rather than regulating the company's rate of return. 
Key members of Congress, not convinced of the price cap plan's benefits, have already put the FCC on notice that they want the commission to proceed very slowly. Likewise, business network managers have charged that the plan contains numerous flaws that could make users worse off than they are under the existing regulatory system. For example, the International Communications Association and the Ad Hoc Telecommunications Users Committee argue that the baseline for the initial price caps is set too high and that the annual adjustment formula will not provide any substantial rate cuts. The second big issue is whether the regional Bell holding companies should be freed from the business restrictions imposed by the AT&T divestiture decree. Members of both the House and Senate signaled their intent to address this issue by sponsoring ``free the Bells'' resolutions in 1988. Though the resolutions don't have any legal effect, they send a political message. Clearly, 1989 will be a year of new players but no big surprises in Washington. Analysts conclude that the computer community can bet on three basic trends: modest advances toward deregulation of the telecommunications industry, coupled with lots of congressional hearings; a slightly more aggressive trade policy; and a renewed lobbying campaign to extend the R&D tax credit for high-technology industries. In essence, that's the kind of undramatic, incremental progress that voters said they wanted when they went to the polls in November. The business community supported Bush over Dukakis partly because it did not want any dramatic change in government policies, says Ed Zschau, chairman and CEO of Centstor Corp., a computer peripherals company in Silicon Valley. By Mitch Betts; Betts is Computerworld's Washington, D.C., correspondent. 
<<<>>> Title : Hot bills for '89 Author : CW Staff Source : CW Comm FileName: govtside Date : Jan 9, 1989 Text: Aside from the broad policy issues of economic competitiveness, Washington officials will be grappling with some specific legislation of interest to MIS managers: Anti-virus law. There will be pressure to amend the Computer Fraud and Abuse Act of 1986 to cover the kind of computer virus attacks that hit the Internet research network in early November. All eyes will be on the so-called Herger bill, introduced last July by U.S. Rep. Wally Herger (R-Calif.), which outlaws computer viruses and carries a penalty of up to 10 years in prison. Buck BloomBecker, director of the National Center for Computer Crime Data in Los Angeles, says the most interesting feature of the bill is that it applies to all virus crimes affecting interstate or foreign commerce _ not just federal government computers. ``I suspect there will be some debate over that jurisdictional issue,'' he says. Section 1706. This legislation concerns the controversial section of the Tax Reform Act of 1986 that requires computer consultants working for brokers to become employees unless they meet the stringent legal standards for classification as independent contractors. Congress recently ordered the U.S. Department of the Treasury to conduct a definitive study on the issues surrounding Section 1706 and to report in September 1989. The ultimate fate of Section 1706 probably hinges on the results of that study, according to Joseph E. Collins, spokesman for the Data Processing Management Association in Park Ridge, Ill. Software rental. A bill to block unauthorized rentals will be reintroduced, ``and we see that going forward,'' ADAPSO's Olga Grkavac says. Government computer security. Federal MIS managers will be busy complying with the Computer Security Act of 1987, which requires detailed security plans, training programs and the identification of sensitive systems. The security plans are due Jan. 
9, a year after the act became law. Halon regulation. MIS will be watching to see what the U.S. Environmental Protection Agency does to regulate the use of halon. Last fall, the EPA called for a complete phaseout of halon _ a chemical used in data center fire-suppression equipment _ because it contributes to the depletion of the Earth's protective ozone layer. The 11th Amendment. ADAPSO expects hearings on whether the 11th Amendment exempts universities from certain laws, such as those governing software copyrights. <<<>>> Title : Why things are better Author : CW Staff Source : CW Comm FileName: sutter Date : Jan 9, 1989 Text: When information technology was first introduced, many people worried about the negative impact of computers on the nature of work. These predictions have been proven wrong. Computer-based systems continue to offer the potential to enrich the content of jobs and aid in reaching high levels of quality. Contrary to fears that were expressed in many circles, employment rose rather than declined. The redefined jobs that grew out of early automation efforts have become more essential; for example, greater consequences exist when failure occurs. Over the years, workstation penetration continued throughout U.S. industry. Computers have profoundly affected the way we work in engineering and in factory operations. Quality has risen steadily due in large part to interactive graphics workstations in engineering and factory automation equipment in manufacturing plants. Many specific jobs in engineering associated with checking and support services have been automated and made nearly error-free because of the built-in validations defined in software programs. Jobs on the factory floor have been transformed from repetitive, monotonous and often dangerous tasks to ones working with sophisticated systems. Computer-based systems in the offices have shortened the time to process transactions. 
Tasks previously performed in sequential fashion can now, with the proper databases and communications systems, be done in a more interactive and concurrent manner. More informed decisions are possible. Much has been achieved through the use of information technology. Automated features that monitor everything from spelling and grammatical construction to arithmetic calculations, design parameters and machine-tool performance have assisted us, and will continue to assist us, in becoming more productive and in raising the quality of our work. <<<>>> Title : Think business Author : Dennis L. Duffy Source : CW Comm FileName: duffylet Date : Jan 9, 1989 Text: Your article ``Downsizing threatens MIS influence'' [CW, Nov. 28] addresses a threat only to those who suffer from a certain myopic condition that, unfortunately, has afflicted quite a few people. The implication that the influence wielded by an MIS manager is a function of the number and size of the computers he controls is frightening indeed. The role of MIS is that of implementing effective solutions to business challenges. If we do not consider all solutions to a given problem _ even those that do not involve a computer _ then we are no better than the carpenter whose only tool is a hammer. Every problem will look like a nail. MIS professionals have got to stop being back-office ``byteheads'' and start being consultants to business. If we are afraid to implement technically simpler, lower cost solutions, then we will continue to proliferate technological solutions for their own sake. Doing that will ensure that we remain in the back office as functionaries and out of the boardroom as leaders. Dennis L. Duffy Senior Vice-President Director of MIS Burson-Marsteller New York <<<>>> Title : Their side of the story Author : Paul B. 
Carroll Source : CW Comm FileName: carrolle Date : Jan 9, 1989 Text: Douglas Barney recently wrote a column that began, ``A while back, The Wall Street Journal kept trying to run a story about IBM, and IBM kept making them pull it'' [CW, Nov. 14]. That's ridiculous. IBM didn't ``make'' me pull the story, which said that IBM was developing a personal computer that contained an Intel 80386SX chip but used an AT bus. Nobody can force the Journal to kill a story. I pulled the story on my own halfway through a press run because an IBM spokesman told me the story was flat wrong. IBM public relations people have always been straight with me, and it seemed possible that the development project had been canceled after my sources got their information. The Journal also didn't ``keep trying'' to run the story. I killed the story once, during the press run for a Friday paper. I checked on Friday and found that the story was indeed accurate _ the IBM spokesman had been misled, had misunderstood or was playing a semantic game with me. So we ran the story in our next paper the following Monday. If Mr. Barney had called, I'd have been happy to explain what happened. I appreciate what Mr. Barney was trying to do to clarify confusion, but I think he wronged the Journal along the way. Paul B. Carroll Reporter The Wall Street Journal New York <<<>>> Title : Use the carrot, not the s Author : Thomas R. Weinbe Source : CW Comm FileName: weinlet Date : Jan 9, 1989 Text: The computer security world is inevitably put off by Morrisian escapades because of the threat posed to systems. It is not easy for us to look kindly on the benefits we reap from certain of these experiments, but I submit that we ought to attempt channeling the potentially valuable talents of the hacker subculture. My suggestion is that the industry establish a fund to award prizes to discoverers and documenters of computer system security weaknesses. 
My hope is that the combination of the prize and the resulting recognition would be sufficient reward for the intellectual achievement and deter the recipient from implementing the documented threat. I do not favor renouncing use of the stick but, as is often the case, the carrot could turn out to be an efficient first alternative. Thomas R. Weinberger Manager, Planning and Security Information Systems Division Memorial Sloan-Kettering Cancer Center New York <<<>>> Title : Get it right the first ti Author : Timothy Lister Source : CW Comm FileName: listlet Date : Jan 9, 1989 Text: I enjoyed reading Bob Stahl's In Depth article, ``The ins and outs of software testing'' [CW, Oct. 24]. I can certainly agree with his eight-point reality checklist for more efficient testing. However, I take exception to his statement: ``Software's dismal reputation is largely the result of inadequate testing.'' Software's dismal reputation stems from many factors, such as poor use of methods and tools and overconstrained schedules. Inadequate testing has little impact on software quality. For developers to rely on testing to improve software quality is like doctors deciding to rely on better antibiotics to cure infection after operations. Joseph Lister (no relation) first described the sterile technique when performing surgery: Don't infect the patient in the first place. In 1988, we have Harlan Mills and Victor Basili, among others, describing ``clean room'' software development methods. To improve software quality, let's concentrate on not infecting our creations in the first place, not on more efficient methods of curing illnesses we have caused. Timothy Lister Principal The Atlantic Systems Guild New York <<<>>> Title : Make them pay Author : Zamri Zaini Source : CW Comm FileName: zainilet Date : Jan 9, 1989 Text: It is no public secret that some computer users, including data processing personnel, are not willing to keep their password confidential. 
They do not hesitate to share it with their peers and subordinates (but not with the boss, of course). The disclosure of passwords defeats the costly security feature. But those users do not feel they need to keep their passwords secret. They think it is OK to share passwords, and it is almost impossible to change their minds. There is a technique we can try to convince people who do not want to keep their passwords secret: Provide the password with the facility to access their Personal Information System (PIS). The PIS shows such personal information of the password owner as salary, grade, appraisal history, company loans, personal account and expense reports. The owner of the password will now hesitate to share it. Sharing the password means sharing his personal data. If he still shares it, there is only one option left: Take it back. Zamri Zaini Dallas <<<>>> Title : All in the head Author : Mike Firth Source : CW Comm FileName: firthlet Date : Jan 9, 1989 Text: Gopal Kapur's article in a recent Reader's Platform [CW, Dec. 5] stomps heavily on the concepts presented in an earlier article. In the process, he uses an example that hardly supports his view. He states, ``Imagine a symphony in which the individual musicians are kept ignorant of the main score and know only their individual parts _ one in which the conductor commits the entire score to memory and nothing at all is written down.'' The problem here is that the first part is exactly how most symphonies work: The players are only given the parts they play. One reason is that it saves a lot of money to copy only the parts that each musician requires. Another is that musicians are proud of their ability to sight-read music and to jump from one piece to another. In most cases, the entire concept of the playing is in the conductor's head. The score commonly does not contain detailed comments but serves as a reminder of the flow. Some conductors do not use a score. 
Now we can wonder whether the rest of Kapur's self-serving comments _ since he heads the Center for Project Management _ are as far off base as they seem. Mike Firth Dallas <<<>>> Title : Oh no! Look out! It's Mag Author : Al Perkins Source : CW Comm FileName: perklet Date : Jan 9, 1989 Text: Computer magazines love to beat the drum and ridicule vaporware. We all read with horror and dismay, over and over, how one of the big boys has again fallen short on delivery. I would like to enter a new category _ Magscare. I just laid down another magazine with disgust at another story on how IBM has this monumental plot to overthrow the PC world by producing the Micro Channel Architecture. I believe journalists are to report the news in an unbiased fashion. But it's obvious that we have a host of pundits out there who feel their mission in life is to crusade readers into computer sects. The means may in fact be destroying any enthusiasm or hope in the future of microcomputing. Al Perkins Publications Specialist Embry-Riddle Aeronautical University Daytona Beach, Fla. <<<>>> Title : Good to know Author : James Hursey Source : CW Comm FileName: hurseyle Date : Jan 9, 1989 Text: Regarding the column ``Now you see it . . .'' [CW, Nov. 21], it is nice to know that I am apparently doing it right. I have always insisted that applicants for programmer positions come back for a second interview, during which they meet and talk with as many of the programmers they will be working with as possible, even to the point of going to lunch with them at company expense. This is to give me a chance to see what the chemistry is between the applicant and the staff. I would not hire an otherwise qualified applicant if it seemed he did not fit in well with the group. The most surprising thing about Ms. Ruhl's article is her assertion that most managers do not routinely do this. I find this hard to believe. 
James Hursey Programming Manager The Columbus Dispatch Columbus, Ohio <<<>>> Title : Selling IS out of the hou Author : Clinton Wilder Source : CW Comm FileName: cwfore Date : Jan 9, 1989 Text: ``What's a wood products company doing selling high-technology services?'' From a Weyerhaeuser Information Systems marketing brochure The idea of selling software applications or high-speed data network services outside one's corporation to generate profits can have an irresistible allure, particularly when the corporate world is demanding that MIS professionals be business strategists as well as technologists. But a successful software or services marketeer is a very different business strategist than an MIS executive who has developed and deployed a strategic information system. In the last few years, most traditional MIS organizations that have sold products outside their parent companies have focused on software or services _ disaster-recovery capability, for example _ that are already available in the marketplace. Few are willing to risk losing any competitive advantage gained through a strategic system by spinning that system into a marketable product. If outside buyers ``can put in a system we've developed that would give them advantage against the Weyerhaeuser company, we have to place a limit. We have to be extremely careful,'' says Susan Mersereau, vice-president and general manager of Weyerhaeuser Information Systems in Tacoma, Wash. With an increasing number of U.S. companies weighing strategic moves outside their traditional markets, the trend of MIS selling services outside parent organizations is likely to accelerate in 1989. In transaction-oriented industries like banking, large user firms such as Mellon Bank NA have sold data processing services for years. But within the past three years, key players in industries like manufacturing, construction and petroleum have joined in. There are very few established rules in the user-turns-vendor game. 
Of the broad spectrum of U.S. companies that are playing, no two are doing it quite the same way. If there are any guidelines, they might be these: Plan such a move carefully, start out small and go slowly. Aside from the basic elements of learning to market and sell in an industry as competitive as any other, becoming a vendor raises major, yet often subtle, relationship issues. If the relationship between MIS and internal end users is not clearly defined beforehand, for example, the jump into the computer industry can actually distance the MIS function from the corporate strategy it is trying to join. ``Selling outside services tends to detract from the main mission unless the [MIS] organization is doing an exceptional job taking care of internal customers,'' says Al Barnes, a Los Angeles-based managing associate of the Index Group, Inc., an information systems consultancy in Cambridge, Mass. ``Most [MIS] organizations are not well served by going outside and becoming a vendor.'' Yet many noncomputer companies are finding success by selling software and services; this success cannot always be measured by revenue growth or market share. Companies that assess the impact of their outside move on their internal customers see benefits on both sides. ``If you are only operating within a corporation, you're not in a selling mode; you're responding to demands,'' Weyerhaeuser's Mersereau says. ``As a business, we've become much clearer on relationship selling _ seeing [external and internal] users as partners.'' Like several other firms' computer vending units, Weyerhaeuser Information Systems includes internal MIS, with service to users in Weyerhaeuser's various business units accounting for about 85% of its revenue. Mersereau, formerly the parent company's director of telecommunications, says she is well aware that Weyerhaeuser Information Systems' primary charter remains the parent company's internal needs, even if that means lost opportunities in the marketplace. 
``If you really wanted to compete, you'd set up a separate commercial business,'' she says. ``But you would lose the synergism and the resource utilization advantages.'' A similar philosophy rules at Pennzoil Co.'s Strategic Information Services Co., or Stratis, formed in 1988 and located in Houston. ``Internal customers are king,'' says Stratis Executive Vice-President Keith Eaton, ``and for the foreseeable future, it will stay that way.'' Stratis grew out of Pennzoil's ambitious 1983 decision to invest $50 million over a five-year period to revamp its MIS infrastructure _ at a time when its competitors in the price-depressed oil industry were cutting IS spending. Pennzoil decided there was little incremental cost involved in going outside to sell its revamped oil and chemical industry applications and network-based services such as electronic data interchange. ``We had already developed the software and made the investment,'' says Stratis President Patrick Manning. ``There was very little downside risk.'' Management involvement In addition to a strong relationship with internal end users, the success of an organization's computer services business also hinges on the involvement of its senior corporate management. At Pennzoil, for example, the Stratis board of directors includes Pennzoil's general counsel and senior vice-president/controller. ``Senior management has to see themselves as owners of this business,'' the Index Group's Barnes says. ``And you have to have a business plan. If you're just selling software outside, you're just selling _ you're not a vendor, you're not really in the business.'' The Bechtel Group, the San Francisco-based construction conglomerate, went through just such a thought process before it started up Bechtel Software, an independent business unit, in Acton, Mass., in 1987. 
Clients who used Bechtel's internally developed computer-aided engineering and project management software during a Bechtel project contract wanted to keep using _ and therefore license _ the software when the contract ended. In 1985, Bechtel allowed its various units to sell software that way, but senior management became concerned about the scattered approach taken to the idea of selling software. ``They felt Bechtel should either be in the software business or not do it at all,'' says Bechtel Software President John Lucas. ``The way we were selling it was not a good way to protect our reputation as a company. We weren't set up to service the customer over the long haul. There weren't any clear guidelines for licensing, development and support issues.'' As might be expected from its location _ it is 3,000 miles from Bechtel headquarters _ Bechtel Software is completely separate from Bechtel's internal IS organization. Lucas says the subsidiary maintains close ties with the software development teams within its parent company's MIS, but as of now, with about 40 customers, Bechtel Software has plenty to choose from among Bechtel's 800 proprietary applications. ``We're not trying to sell all the software Bechtel has ever developed,'' Lucas says. ``We focus on those with commercial appeal.'' Another advocate of the arm's-length approach between MIS and software selling is an early pioneer among computer users-turned-vendors, Westinghouse Electric Corp. In 1969, the Pittsburgh manufacturer, having developed several IBM DOS/VSE utilities for its own MIS needs, saw a market for those utilities outside the company. Thus was born Westinghouse's Management Systems Software subsidiary, making it not only one of the first non-computer companies to sell software, but one of the first independent systems software vendors, period. One way a firm's internal users can help the vendor unit is to act as sales references. Such a relationship exists at Agway Data Services, Inc. 
(ADS), the computer services unit formed by the Syracuse, N.Y.-based agricultural products giant in 1987. Like Weyerhaeuser Information Systems and Pennzoil's Stratis, ADS grew out of Agway MIS, and most of its sales come from Agway business units. ADS calls on those same units to be sales references when it seeks outside customers for its data communications, remote processing, disaster recovery and other services. ``The first thing that you've got to do is gain respect,'' says ADS President Dennis LaHood. ``If you're not providing quality services to [your own] business, you're not going to be able to put forward the idea that you can provide those services to others.'' To LaHood, the biggest hurdle in becoming an outside supplier is learning to sell. ``The idea of marketing and sales is new to a traditional IS organization,'' he says. ``But once you get past the point where a customer has developed an interest in ADS' capabilities, everything else is similar'' to serving Agway business units. But there is a potential dilemma with a noncomputer company selling software and services to its competitors: The buyer is ostensibly purchasing the computer technology to enhance his own competitive advantage. The solution, companies say, is to assess whether competing products or services already exist in the marketplace. ``With a service like disaster recovery, if customers don't buy from us, they go to a competitor,'' says Weyerhaeuser Information Systems' Mersereau. ``The same is true if you're competing with off-the-shelf software.'' Pennzoil's Stratis sticks to generic software in oil production, oil royalty revenue reporting and basic financial applications. ``Ninety-nine percent of the software we sell is available out there in the marketplace, so there's no conflict,'' Manning says. 
Even if the software seller solves this competitive dilemma, there remains the possibility that internal and external customers' future needs may diverge, particularly in future enhancements and upgrades. ``There can be enough differences among your customers that you'll end up with lots of versions of the same application, and that can cause you to lose control of the base system,'' Barnes says. ``If you're selling outside, the management process of enhancing a system becomes much more complex.'' All these issues _ and the overriding issue of the implications for a user organization's corporate mission of making the computer-industry-vendor plunge _ should be assessed carefully beforehand by both MIS and top corporate executives. Some companies have hired outside consultants for help, as Agway did in 1985 when it contracted John Diebold to study its MIS strengths and market opportunities. Not only is the strategic planning process of paramount importance in considering the question of vending MIS products and services, but the decision process itself can be invaluable _ even if the answer is no. An MIS department's examination of whether it should go into business can be the best way to redefine its role in its firm's business. ``Top management has always had questions about IS; this is the opportunity to talk their language,'' Barnes comments. ``Whether you sell outside or not may depend on your capabilities vis-a-vis your competition, but there's a benefit in thinking about it.'' By Clinton Wilder; Wilder is Computerworld's senior editor, computer industry. <<<>>> Title : Revolutionizing the worki Author : CW Staff Source : CW Comm FileName: gerrity Date : Jan 9, 1989 Text: Computers and other information technologies provide three primary benefits to an individual, business function or an entire organization: efficiency, effectiveness and transformation. 
Understanding the nuances of each benefit sheds light on how computers have changed the way people and their organizations conduct business. Today, given intense marketplace and competitive pressures, more and more companies are attempting to go beyond efficiency and effectiveness gains from information technology. They are embarking on truly transformational work changes involving the use of information and information technology. Applications of information technology that create efficiency allow a user to work faster and at a significantly lower cost. An example of this would be automation of the clerical work in insurance claims processing. Effectiveness applications result in better, higher quality work. For instance, when a large company with autonomous divisions implements a common purchase order information system, the system creates efficiency because it streamlines an onerous paperwork process. However, because it also enables the company to see, for the first time, its overall purchasing relationship with each supplier, there is an effectiveness benefit: The company can negotiate volume discounts. Transformational benefits alter the fundamental way people, departments and even whole companies work with each other, creating a major competitive advantage. The information systems Otis Elevator Co. uses to improve the servicing of elevators speed the response time to service calls (efficiency), enable the company to provide better service because the system keeps historical service data on each elevator in use (effectiveness) and provide an ongoing diagnosis of elevators in use for preventive maintenance (transformation) and for locking in the service call. One of the major transformational changes in business today is toward cross-functional integration and creating the flexible organization. In this area, communications and networking technologies are causing powerful changes in the way people communicate, work together on tasks and make decisions. 
In an evolutionary way, technology is enabling people from a variety of business functions _ marketing, sales, purchasing, manufacturing, administration, research and development and so on _ to work much closer together in the provision of a product or service. Historically, these departments, and the people who worked in them, were segmented from others within the firm. Tech breakdown Today, however, communications and networking technology (and computers) are breaking down those barriers to interpersonal communication. The technology is making it far easier for various corporate constituencies to exchange information and work together, though not necessarily in close proximity, on projects. Because the marketplace is becoming only more competitive and changing more rapidly, companies, more than ever, are looking to derive transformational benefits from information technology. Many of those involve using communications technology that facilitates the actions of people across functions to improve service, speed product development and share valuable information about customers. In people's everyday work life, these changes will result in more employees working at home from a computer that can communicate with headquarters or other offices in the company and heavier use of electronic and phone mail. As new technologies come on the scene, we will leverage them to further revolutionize the way we work. <<<>>> Title : Downsizing an opportunity Author : Michael Alexande Source : CW Comm FileName: downsize Date : Jan 9, 1989 Text: Corporate America is in the midst of a battle to become lean and mean, even while many companies are gulping down others like fish in a feeding frenzy. But the way many corporations are trimming the fat is not by divesting unprofitable divisions but by downsizing their MIS operations. 
``Basically, downsizing is the idea of taking systems that may have been on a mainframe or minicomputer, perhaps with complicated connections into other systems, and surgically removing them and placing them on personal computers,'' explains Theodore Klein, president and founder of Boston Systems Group, a consulting and systems development firm headquartered in Boston. ``It could also mean shutting off the lights in the data processing center, disassembling [the center] and dispersing it throughout the company.'' Klein estimates that 20% to 25% of Fortune 1,000 companies will be downsizing their information technology systems in 1989, and 40% to 50% of those companies will be downsizing their systems the following year. He believes that the downsizing process will be completed in the mid-1990s, when virtually all of the top corporations in the nation will have decentralized at least a portion of their computer operations. Downsizing from mainframes to micros on local-area networks is reshaping the minicomputer industry, says The Sierra Group, Inc., a market research firm based in Tempe, Ariz. In 1988, the group surveyed more than 2,000 companies and found that 30% of them are using PC networking technologies, up from 24% in 1987. Based on current growth rates, LAN-based solutions may easily overtake centralized departmental minis by 1992; by 1995, networked topologies will represent the larger share of installations, The Sierra Group forecasts. There are at least two fundamental reasons behind this rapid adoption of downsized systems. The first is that the price/performance ratio of personal computers is improving virtually by the nanosecond, leading some analysts to predict that as much as 80% of corporate computing power will be on desktop systems by the mid-1990s. 
In only a few years, it will not be uncommon for many white-collar workers to have mainframe computing power on their desk tops, says Don Tapscott, an expert on office automation and end-user computing for the DMR Group, Inc., a management consulting firm with offices throughout North America. With cheaper and more powerful desktop computers will also come machines and multimedia interfaces that are more ``people-literate,'' rather than requiring people to be computer-literate, Tapscott points out.

The end-user revolution

Perhaps the most potent force fueling the downsizing engine is the rise in end-user computing. Consider that a majority of white-collar workers will already have had 10 to 15 years of experience working with personal computers by the mid-1990s. These same workers have moved from experimenting with PC technology to applying it skillfully in ways that simply were not thought of only a few years ago. Indeed, in many corporations, end users have detoured around the applications backlog in many MIS shops and are actively developing sophisticated applications on PCs. ``In the old days, things were expensive and expertise was thin, so it made sense to centralize and take advantage of the economies of scale,'' Klein says. ``But companies are looking at downsizing more, now that PCs are so widespread and end users have built up their computer expertise.'' Once end-user computing has taken hold in a corporation, senior managers begin thinking more seriously about downsizing, he says. ``Senior managers often start looking at downsizing as a way to cut bloated bureaucracies, which are often in MIS departments,'' Klein says. ``In other cases, it starts growing out of end-user or departmental expertise.
End users begin rebelling against paying exorbitant mainframe charges, and there is a bottom-up swell that leads to change.'' ``Mainframes are expensive, and the ongoing costs are pretty hefty _ that's what typically gets information systems managers thinking about downsizing,'' says Randy Gottwaldt, who manages information systems at Imprimis Technology, a Minneapolis-based disk drive manufacturer. ``Also, if you have a lot of stand-alone PCs scattered around, management starts looking for ways to make better use of them.'' Downsizing can be of strategic importance to a company, perhaps to enhance its competitive edge, Gottwaldt says: ``For us, that was a significant consideration. Depending upon your product, you may need to spread the costs around quite a bit, especially in a high-volume, low-margin business like this one. The savings of downsizing are fantastic. ``Many businesses want to do something radical because of a change in the business climate, say, when they go from facing competition on a domestic level to competition on an international level,'' he adds. ``That creates a lot of pressure to succeed, and downsizing can help.'' Imprimis Technology, which is wholly owned by Control Data Corp., decentralized a manufacturing application, moving it from the mainframe to a local-area network, Gottwaldt recalls. The project was started three years ago. A successful transition requires the enthusiastic endorsement of upper management, he says: ``They have to be visible to end users. Their having a good attitude about the change also helps because it filters down through the ranks.'' Information services managers who have successfully downsized cite numerous benefits, not the least of which are the potentially huge cost savings and added flexibility that comes from putting PCs instead of terminals on desk tops. ``We spent a bit less than $100,000 to buy 75 personal computers,'' said one IS manager who asked not to be identified.
``That included everything _ cables, network cards, software and several other things. That's about what we were spending per year in mainframe fees alone.''

Cost-cutting the key

The primary consideration was the cost savings, concurs Roger Goss, coordinator of system interface at Eastman Kodak Co. in Rochester, N.Y. He recently completed a downsizing operation that replaced a mainframe publishing system with a local-area network of IBM Personal Computer ATs and Apple Computer, Inc. Macintosh SEs in the photographic products group publication division. ``We eliminated an annual fee and used the money to buy new computers and get new technology in the process,'' he says. ``The price of the whole network, including hardware and software, was about equal to the cost of one year's mainframe fee.'' A second _ though equally critical _ consideration was that the previous system, used to publish manuals and documentation, was ``not friendly to someone who needs to create a lot of words,'' Goss says. ``It used proprietary software, and we had no flexibility. Now our systems come with whatever software we want, and if we're not happy, we can switch.'' The network works like a charm, day in and day out, Goss says. ``The only problem came when we were upgrading the network software, but even then, during the time that the network was down, people were still working,'' he says. ``The IBM PC AT users could not do any printing, but they knew that in advance and were able to pull off the files that they needed for that. The Apple Macintosh users were able to continue right on because they were connected to laser printers on an Appletalk network.'' The modularity of the network _ the ability to swap hardware and software at will _ is one of the key benefits to downsizing, Goss says. ``You can swap a PC AT for a Macintosh or vice versa, and you only have to train one person at a time if that is all you are able to do,'' he points out.
``Where budgets are tight, you don't have to buy everything in one fell swoop. Those are definite advantages.'' PCs, servers, software and related networking products are available from a variety of sources, and MIS managers can leverage competitive differences among suppliers to increase their buying power, according to several MIS managers.

Incompatible platter

Downsizing is not without a downside, however. When end users are left without guidelines, they are apt to assemble a smorgasbord of incompatible LAN environments. Their purchases may be naive and redundant, ultimately putting a heavy burden on MIS resources. ``Our biggest mistake was that we didn't follow through properly on restructuring,'' says Gary Biddle, corporate vice-president of management information at American Standard, Inc. in New York. It cannot be assumed that systems become simpler when they get smaller, he warns.

Don't forget applications

``If you are going to downsize, you can't forget to downsize the applications as well,'' Biddle says. ``In one instance, we had a system supporting 14 levels of management, where really there should only have been five. There is more to it than just identifying a few flashy applications to downsize.'' Downsizing also has ramifications for the data center in ways that many IS managers may not expect. The influence that IS holds in the corporation disperses as rapidly and as widely as the computer systems it has been charged to decentralize. ``The changing information economics that will put 10 MIPS on a desktop will mean there will no longer be need for a central computing facility,'' Tapscott says. ``The organizational structure will change,'' Klein adds. ``Senior management will need to cope with such issues as deciding to whom the company's MIS professionals will report.
Will they report to a CIO or to the department head where the network is located, for example?'' With the downsizing of computer systems could conceivably come a downsizing of the data center, Tapscott explains. Taken to the extreme, there may no longer be any need for a data center. There is certainly a relationship between the two _ downsizing systems and downsizing the data center _ but the issue is one of finding a balance between dispersing the technology throughout the organization and keeping the data center, Klein says. One of the key roles of the data center will be to ensure connectivity and to build the platforms on which these dispersed systems will sit, the analysts agree. MIS will control and manage the connectivity of the company's networks and set guidelines and methodology for using the network. ``The analogy that I use is as follows: When the first automobiles were introduced, you had to be a big strong hefty man with a knowledge of mechanics to drive one, so there weren't many on the road,'' Klein says. ``Now, anybody can do it. The MIS department will be needed to lay down the roads, put up the signs and even put in a police unit. But they won't be needed to build the cars any more.'' By Michael Alexander; Alexander is a Computerworld senior editor, microcomputing. <<<>>> Title : Discovering tools for inf Author : CW Staff Source : CW Comm FileName: foremora Date : Jan 9, 1989 Text: The competition within retail, insurance, real estate and financial services is intense. We at Sears, Roebuck and Co. must be able to anticipate _ or at the very least react quickly to _ changes in our business environments to stay ahead of the competition. Effective communications and information management is essential. Our people must have quick access to information whenever and wherever they need it. They also need the ability to access that information in whatever format they want so that it is useful to them.
Information technology is helping us to respond to these requirements at Sears; as a result, major changes are taking place in the way we do our work. Modeling tools are allowing us to see the impact of our ideas without actually having to implement them first. Forecasting tools help us anticipate our markets so we can be active rather than reactive. We expect both sets of tools to become even more sophisticated in the future. Electronic mail is already a regular part of our everyday lives. We send and receive messages, letters and documents throughout the organization at our convenience _ instantaneously, any time, day or night, to and from any location. Personal computers and portable CRTs even allow access from our homes or while traveling. Because the correspondence can be stored and retrieved at later dates, paper is largely being eliminated. Finding alternatives to handling the ever-growing mountain of paper is increasingly important. Paper itself is not only expensive; it is also difficult to distribute, file, store, retrieve and manipulate manually. Internally, our people are now receiving many reports on-line. The reports are available for viewing moments after they are generated. We are also starting to experiment with image-management systems, which can capture many images such as graphics in addition to text. We believe these systems will become even more important to us in the future as their prices come down and their capabilities improve. The vast amount of information we collect is being stored in subject area databases, most of them relational. This storage allows us to manage the information independent from applications to assure that it is consistent in meaning and structure and can be shared across the firm. In addition to stand-alone processing, end-user computing tools are already providing our business users with access to these databases and, thus, more flexibility and control of their information systems needs. 
They can readily view, manipulate and print the information they desire. They can download information from the mainframe databases to their PCs for local processing in various formats such as spreadsheets. All managers are now responsible for directing systems development just as they direct the other product, service and cost aspects of their businesses. They can no longer ignore the use of information technology; it is now part of their job rather than strictly a responsibility of the MIS organization. Information technology has become a strategic resource on a par with people and capital. It is not only a means to reduce cost; it can also be thought of as an investment to create increased future revenue. Just as business goals drive the development and use of information technology, imaginative use of information technology can create business opportunities. Our challenge in the future will be to keep abreast of the rapidly changing technical environments so that we will be in a better position to look for strategic use of information technology. Our work environment will certainly continue to change as a result. <<<>>> Title : Integrated net management Author : Patricia Keefe Source : CW Comm FileName: tkforcas Date : Jan 9, 1989 Text: The current state of integrated network management brings to mind the rock 'n' roll lament that ``you can't always get what you want.'' But what really kills today's frustrated network managers is that, try as they might, sometimes they still can't get what they need. And in a way, they have no one to blame but themselves. In their rush to get out from under the tyranny of single-vendor solutions, users are discovering that mixed standards and multivendor environments have created a Pandora's box of network management nightmares. In short, you can't have your cake and eat it too _ at least not today. ``I talk to two or three users a day who are screaming their heads off over this.
They are sick of having a bunch of tools that don't jell,'' says Jeremy Frank, program director at the Gartner Group, Inc., a Stamford, Conn.-based market research firm. What many users say they want, what some industry pundits say is coming and what vendors swear they will provide _ someday _ is an integrated, comprehensive centralized network management package. User pressure has succeeded in forcing vendors to at least develop and publish blueprints for network management systems. ``Users have made network management a prerequisite for consideration in almost every [request for proposals],'' says Jeffrey Kaplan, director of network and professional services for The Ledgeway Group, Inc., a Lexington, Mass.-based consulting firm. But the resulting architectures have amounted to just so much paper and ink. ``There's been no real progress this year. You can't buy any of this stuff, can you?'' asks Frank Dzubeck, president of Communications Network Architects, Inc. in Washington, D.C. So the vast majority of users such as Dennis Turek, a software analyst with Anheuser-Busch Companies, Inc. in St. Louis, are forced to continue to resort to ``roll-your-own'' network management. ``Users absolutely have no choice,'' Frank claims. Fortune 1,000 data communications managers surveyed earlier this year by International Data Corp. (IDC) in Framingham, Mass., revealed that 26% _ the highest percentage responding _ plan to develop their own network management systems. ``The smart people have to fashion their own [management system]; the people who wait for one to come by will fall by the wayside,'' predicts American Cyanamid Co.'s Joseph Kascik, a manager of network planning for the Clifton, N.J.-based company, which has a worldwide virtual integrated voice/data network.

Pulling in the reins

It is ironic that in this age of open network hype, users now find that it behooves them to begin backing away somewhat from their eager forays into multivendor connectivity.
Of course, no one is proposing a return to proprietary vendor systems. But where users are already grappling with long-term network strategies and standards, it is easy enough for some MIS departments to begin pulling in the reins on diverse network implementations. The goal is to winnow a wealth of competing networks down to two or three manageable systems. ``There's a strategic direction here to move three networks into one,'' admits Ron Robeck, manager of technical services at Affiliated Bank Services in Thornton, Colo. Affiliated has standardized on one network management package but uses three different networks mostly because of the requirements of different hardware. Anheuser-Busch runs two networks, one from IBM and one from Prime Computer, Inc. ``They are pretty separate right now, but we're looking at how to tie them together,'' says Turek, who adds that his department is looking hard at LU6.2. Each network maintains a database that must be kept in synchronization with the other. Turek notes that it would be a lot easier to run one database on one network. After consolidating networks, the next step for users is to select a network management approach. Most of what is out on the market today seems to feature compatibility with one of two adversarial camps: IBM's host-based Netview management system and the Open Systems Interconnect (OSI) network management standard, CMIS/P. Lacking one be-all and end-all management system, users are further faced with dividing their choices into one of two approaches: a single-vendor solution or a mix of network management tools that may or may not interoperate. Users have to be careful to weigh short-term goals against long-term needs. The wrong technology decision could boomerang painfully at some future date.
``We try to purchase management equipment with an eye toward interacting with the systems that exist and with what we know we'll have to acquire,'' says Jeff Harris, director of planning and technology for Mattel, Inc. in Hawthorne, Calif. Mattel is in the midst of revamping a worldwide ``kludged'' network that had no real network management in an effort to integrate voice, data, facsimile and telex. The first thing users see when they begin their quest for network management is a veritable sea of choices bobbing with different network platforms, all festooned with ``open interface'' banners. According to an April survey by Forrester Research, Inc. in Cambridge, Mass., users are targeting their dollars at IBM's Netview, AT&T's Unified Network Management Architecture, OSI and Digital Equipment Corp.'s Enterprise Management Architecture (EMA). Even so, the market is so fragmented _ and clearly still up for grabs _ that the leading segment in the survey was the ``other'' category, with 37% of the purchasing share. While they wait for vendors to flesh out their strategies and support for multiple environments, many users are taking a deep breath, crossing their fingers and plunging forward into homegrown solutions. One approach calls for selecting a few areas to monitor _ for example, session management, physical-layer management and packet-switch nodes. A mix of network management systems usually does the job, but more often than not, the systems are completely separate. ``From a network control perspective, we basically are going for the best equipment that we can find for the particular function, but it has to interoperate with what we've got,'' Harris says. His IBM Systems Network Architecture network is standardizing on IBM's Netview, with a Datapoint Corp. server functioning as the gateway to all non-SNA systems. ``We've made some pretty cavalier decisions in terms of moving forward,'' he points out.
``I'm afraid that one of these days it will turn out that one of our primary [business partners] has a whole office filled with Wang or something, and I'll go `Oops!' When that happens, I'll have to get real smart, real fast.'' About 50% of the IDC survey respondents said they currently use from one to three network management systems. Roughly 30% use between four and six systems; another 2% use more than six; and about 10% use nothing at all.

Interconnection futures

Many users answer the challenges of interconnection or interoperability by resorting to reentering a printout from one system on another system's console. In lieu of concrete offerings from vendors, users are adding some elegance to the ``swivel-chair'' approach, says Marvin Chartoff, an analyst with Ernst & Whinney in Fairfax, Va. In other words, move all the consoles for the different systems together, allow them to dump data into the same printer and start gearing up for automated operations. A number of vendors are taking steps to provide for interconnection, but again, they're talking futures. For example, Hewlett-Packard Co. and AT&T are publishing the specifications to their management architectures. IBM released the specs for Netview/PC to third parties, and OSI cheerleader DEC indicated it may provide specs to its recently announced EMA sometime next year. All these companies have promised to migrate their systems to the OSI Network Management standard _ once it becomes finalized, which may be as much as three to four years away _ or to at least support the OSI communications interface. Yet even OSI provides no guarantee of interoperability. As users of CCITT X.400 gateways are finding to their increasing dismay, just because two products comply with the standard doesn't mean they can talk to each other. It also means users run the risk of getting locked into proprietary implementations of these standards.

The solo approach

The alternative approach is to go with a single-vendor solution.
``I believe the one-vendor philosophy is a little easier to coordinate,'' Affiliated Bank Services' Robeck says. But this approach is not always the most realistic one. ``It's true that staying with one homogeneous approach is easier on development overhead and system administration, but given the real world, in most installations there is a real need for more interoperability,'' says Clare Fleig, director of research at International Technology Group. She sees the single-vendor approach as more suited for larger, terminal-based systems. Users in this camp risk potential lock-in and commitment of the future of their network to one vendor, Chartoff says. ``You can also eventually restrict the types of applications that you support,'' he adds. Actually, most network management systems combine vendor-supplied network hardware and a mix of third-party packages and internally developed applications and systems. For example, the New York-based Financial Industry Standards Organization (FISO), a consortium of some 20 financial services companies that are trying to develop a common communications standard for their industry, is planning to use a mix of OSI and proprietary networking solutions. But some users such as Cyanamid's Kascik view standards with considerable suspicion. ``I'm not waiting for OSI,'' he maintains. ``Standards would be nice, but vendors won't allow real good integration unless you are willing to write hooks into their systems.'' Regardless of which way users turn, it is imperative that they start moving now, Frank says, warning, ``If you don't start today, you won't have a prayer. The average 3090 generates 1,000 messages per second. The average network control operator can read one per second and act on one every two seconds. Consider that CPU activity is supposed to go up 12% a year.
So if you think things are out of hand now, in a couple of years it will be worse.'' Users can begin to lay the foundation for network management even if they are undecided about which direction to take, Gartner's Frank says. ``You have to realize that all network management decisions today are tactical _ the strategic stuff is five years out.'' He advocates organizing ``tedious, low-level, time-consuming tasks,'' such as the following:

Combining databases, or at least making multiple databases appear as one entity.

Starting now to build a pool of network management specialists and cross-training at every opportunity.

Knowing what you have, where it is and what you need.

Moreover, users should determine what they really need to manage their networks. ``Vendors can only provide a reasonable solution to the extent that they get reasonably stated problems,'' notes Thomas Nolle, president of Haddonfield, N.J.-based CIMI Corp. Regardless of the path that they take, users need to make certain that their chosen management strategy will help them control their networks rather than give vendors a way to control the account. Advises Nolle: ``Select the network management system that seems like a fit to your problems, even if it's not integrated. It's much better for you than one that is beautifully integrated but does not meet your needs.'' By Patricia Keefe; Keefe is a Computerworld senior editor, networking. <<<>>> Title : Which is the key network Author : Patricia Keefe Source : CW Comm FileName: netside2 Date : Jan 9, 1989 Text: A bewildering array of architectures is vying to be the central network management system that will control and monitor users' enterprise networks. Vendors have been announcing these products at a steady clip during the last two years. The bad news is that most of them are pure vapor.
On the upside, the suppliers of many of these so-far-transparent offerings plan to either publish their technical specifications or at least open up their systems to other vendors' products via an Open Systems Interconnect (OSI)-compliant interface. The following is a list of some of the key network management architectures:

OSI's Network Management Forum, which is developing specifications to ease multivendor network management interoperability. The profiles for OSI Layers 1 through 6 and the first three sublayers at Layer 7 _ including the draft OSI proposal for Common Management Information Services and Protocols _ should be set by the end of 1988. There are at least 33 members, including AT&T, Hewlett-Packard Co. and Northern Telecom, Inc. IBM and Digital Equipment Corp. do not participate.

IBM's Netview, a host-based network management umbrella for a range of Systems Network Architecture monitor and control tools. IBM has said it will support an interface to OSI. Netview/PC is the mechanism by which IBM supports non-SNA networks. Both products were announced in May 1986.

DEC's Network Enterprise Management Program, announced in September 1988, reportedly will provide users with a single, comprehensive solution to managing multivendor, geographically dispersed, enterprisewide environments. The Enterprise Management Architecture is said to provide a foundation for the integration of OSI standards.

AT&T's Unified Network Management Architecture, a three-tier blueprint that reportedly will let users manage nine areas, including fault isolation, performance, security and integrated control. It was announced in September 1987.

Cincom Systems, Inc.'s Netmaster is viewed as a modular clone of Netview, although users and some analysts maintain it is easier to use, has more features and is more easily tailored to a particular network's needs. Introduced in April 1984, it did not really take off until 1986 when IBM entered the market with Netview.
HP's Openview is described as a comprehensive, integrated set of software, hardware and support products that embrace the OSI network management architecture. The system features a graphical interface based on Microsoft Corp.'s Windows that reportedly will provide network managers with a standard integrated way to manage multivendor networks. It was announced in March 1988. PATRICIA KEEFE <<<>>> Title : CIO's lives are turned to Author : James Connolly Source : CW Comm FileName: turn3 Date : Jan 9, 1989 Text: More than a few eyebrows were raised _ and, presumably, more than a few resumes were checked _ last spring when a survey of chief information officers revealed that more than one-third of their predecessors had been fired or demoted from their jobs. That survey, by Touche Ross & Co., might help CIOs argue for hazardous duty pay. The researchers uncovered an annual CIO turnover rate of 17% leading up to 1988 and other signs indicating that ``CIO job security'' is a contradiction in terms. Several observers guess that the 1988 rate will double that figure. CIOs _ whether they possess the formal title and act as corporate information czars or merely rank as top computer executives _ are changing jobs in an atmosphere of frustration and controversy. They are being forced into the streets by disgruntled chief executives and corporate mergers, and they are jumping to grab jobs that are new opportunities for them and bad memories for their predecessors. The list of well-known top MIS executives on the move grew throughout 1988. Pillsbury Co.'s John Hammitt jumped to United Technologies Corp. and was replaced by Carl Wilson. The Travelers Co.'s Joseph Brophy took a job in the company's employee benefits group and was replaced by Lawrence Bacon. John Hancock Mutual Life Insurance Co.'s Edward Boudreau was promoted to president of a company subsidiary. Sun Co.'s Dudley Cooke took early retirement when the company reorganized. 
Other ranking MIS managers left the business for professions ranging from the ministry to podiatry. James Kubeck, president of Whiting, Ind.-based management consulting firm K4 Enterprises, estimates that the number of CIOs seeking his help in outplacement counseling doubled in 1988 over 1987. Is the turnover madness abating? ``No,'' say observers. ``It will get even worse.'' ``The CIOs are asking, `Who am I, and why am I going through this tremendous frustration and discombobulation professionally?' '' Kubeck says. ``The phenomenon is partly due to the greater visibility of the position. The CIOs are more visible because they are closer to the top of the organization,'' notes Michael Simmons, a Bank of America executive vice-president.

Holding steady

Simmons himself is being tossed about in the turnover storm. He moved up a career ladder through mid-size financial institutions to Fidelity Investments subsidiary Fidelity Systems Co., where he took over the role of CIO in 1984. But a disagreement with his company president _ Simmons arguing that Fidelity should stay with a centralized strategy for the time being rather than move immediately toward decentralization _ led to Simmons' resignation and eventual move to Bank of America. Even at that bank there was CIO-level turmoil as Max Hopper ended a short tenure by returning to American Airlines and being replaced by former Seafirst Corp. executive Louis Mertes. Mertes then left the top Bank of America IS job in October 1987, due at least in part to the failure of a costly, long-term pension management system, which he inherited. Simmons then became Bank of America's third CIO in three years. Failing to meet expectations is one of the key causes of CIO turnover, says Zale Corp. Vice-President for MIS David Karney. Karney, who until Nov.
1 was the top computer executive at another Dallas-area retailer, Southland Corp., notes, ``I've seen various statistics that say a senior data processing person in any major organization has a three- to five-year tenure on average. I guess I am fairly typical, if that is the case.'' He changed jobs after five years at Southland because of the career challenges that a specialty retailer such as Zale offered. ``It's easy for a chief executive officer to think that someone can come in and make sweeping changes in the systems organization, but if the mind-set of the company is not ready to change, the information executive can't really pull off the changes that have to be made,'' says Karney, who notes that CIOs may possess their own misunderstandings of what they can accomplish. He urges that CIOs exercise patience as they learn their business. Many CIOs _ such as The Travelers' Brophy _ have worked 25 years or more in the same field and are seeking a change of environment, Simmons says. He also notes that the top MIS post is stressful to start with and is made more so by arguments about corporate strategy, such as those relating to decentralization. ``Centralization vs. decentralization is purported to be the crux of the whole argument,'' Simmons explains. ``But the point is that it doesn't matter whether you centralize or decentralize, as long as it works well for your own data processing organization and your own company. . . . They [CIOs, CEOs and users] are arguing about the chipped paint on the deck while the boat is sinking.'' Simmons advises that the two best ways for CIOs or CIOs-to-be to avoid becoming a turnover statistic are to learn as much as they can about their company's business and to stop talking in MIS dialects to their peers outside MIS.
Yet while the common advice to CIOs is to get more involved in general business management, one management consultant says a leading cause of turnover in the future will be CIOs who understand business but have drifted away from technology. Bruce Rogow, executive vice-president for worldwide analytical resources at The Gartner Group, Inc., listed four causes for what he perceives as a doubling in the CIO turnover rate in 1988. The first three causes were similar to those cited by others in the industry: CIOs, whether good or bad at their jobs, being caught in a numbers game as the merger of two firms leaves one position; ``mediocre to good CIOs staying one step ahead of the posse'' by jumping to new jobs before they can be fired; and senior management becoming dissatisfied when even a capable CIO cannot adapt the IS strategy as the company changes rapidly. Rogow's fourth cause sets him apart from other observers: ``The one I think is going to create havoc in the next few years is that most CIOs are not prepared to deal with what is happening in technology. They have taken the `T' out of IT _ information technology. . . . You can't forget that you are still the chief technology officer.'' He says too many CIOs focus on building strategic systems of the type made famous by American Airlines and American Hospital Supply Corp. while ignoring the need to recognize which technologies, which products and which vendors can fit together to help the CIO's company. Rogow cites as an example the rapid changes in database technology and notes that a CIO must understand the differences among varied offerings and how those new platforms differ from what MIS used for over a decade. The man generally credited with originating the term ``CIO,'' William Synnott, director of the Banking Division of the Nolan Norton Institute, says that turnover will continue.
Synnott notes that the ``old era'' of DP was stable and that managers felt safe as long as they provided the needed services and contained costs by maintaining economies of scale. MIS is moving away from that back-shop environment, Synnott says, and some managers are unprepared for the new world. Unlike Rogow, Synnott expects more of the positions to be filled by general managers _ executives who rose through the corporate ranks in non-MIS positions such as finance and marketing. ``When that occurs, a lot of MIS managers who were passed over feel usurped and start leaving,'' Synnott says, adding that other CIOs or potential CIOs quit when they see their empires diminished by decentralization. The CIO who may become CEO is unlikely to be one who spent a career in MIS, he says. The former CIOs who became CEOs, such as John Reed of Citicorp and Robert Crandall of American Airlines parent AMR Corp., spent most of their careers in non-MIS positions. Synnott, like many other consultants and CIOs, advises those climbing the IS ladder to step out of IS to get better business experience. Harold Cypress, western practice leader for information technology at Arthur D. Little, Inc. in Los Angeles, says that a general business background helps a CIO candidate understand where the company is heading. ``Most of the CIOs I know spend as much time worrying about how things are going to look in the future and what the business' needs will be in terms of systems support as they do worrying about day-to-day operations,'' he explains. Cypress, who says the perceived increase in turnover might actually be attributed to the fact that some companies are creating CIO positions, cites two common mistakes in CIO hiring. First, CEOs tend to get a profile of the type of person they want based on what they read about successful CIOs in other companies. Second, CIOs, even when approaching retirement, neglect to groom successors with general management and IS experience.
He notes that most CIOs have five or six direct reports who focus on technology, rather than acting as general managers. Another consultant who reports minimal CIO turnover in his area is Michael Anderson, regional director for management consulting at Groupe DMR, Inc. in Toronto. ``In Canada, the turnover rate may not be as high as in the U.S. But there seem to be more and more people who are in the CIO role, no matter what it is called,'' he says. Two career paths Anderson suggests that one solution to the question of whether general business experience or technical expertise provides the power to move up the corporate ladder would be for a company to offer two career paths with comparable pay and prestige. One would be a technical route leading to a chief technology officer position, while the second would be more business-oriented. Will CIO turnover continue to grow? One of the researchers who worked on the Touche Ross study says yes, blaming ongoing causes such as mergers and changing CEO expectations. But Thaine Lyman, a partner in Touche Ross' Chicago office, also offers one more factor. ``The free lunch _ the big increases in budgets that IS has become used to every year _ is gone. That, combined with the existing pressures on the CIO, indicates that the turnover may get worse,'' Lyman says. But Lyman also sees a ray of hope for CIOs who become statistics in 1989. ``You don't find the guys who are terminated out pounding the pavement for a long time. They always seem to find another job.'' By James Connolly; Connolly is Computerworld's senior editor, management. <<<>>> Title : Computers shift our way o Author : CW Staff Source : CW Comm FileName: vanlear Date : Jan 9, 1989 Text: I think we can all point to numerous examples of the impact of computers on our work lives. The personal computer revolution has created new cottage industries and enabled the distribution of both data and function into the hands of end users.
Technology has found its way into the fields of education, entertainment and, with intelligent workstations, even into the area of application development. The shoemaker's children are finally getting new shoes. More importantly, however, computers and computer-related technologies are changing the way we think about our work. Old solutions and approaches are being challenged as to their appropriateness in a more technically advanced society. On-call programmers see no reason why they must commute to the computer center to fix problems that could be solved at home. End users do not see the necessity of involving programmers to solve their information and reporting requirements. Quietly and at an ever-increasing speed, computers are influencing how we structure our business environment and product delivery mechanisms. These changes, while having a profound technical impact on our environment, are also presenting us with new social and moral issues. Expert systems, whose designs and databases depend on input from knowledge engineers, raise thorny questions in those engineers' eyes about the value of that knowledge and about who owns and controls its use _ sticky issues, to be sure. Information overload _ too much data too quickly available _ is a major by-product of the computer's influence in our workplace. We are being inundated with data that may or may not be useful to us in our decision-making processes. The management, interpretation and transformation of this data into useful information has created a whole new role for knowledge workers. Computers today control our manufacturing, communications, transportation, defense and finance systems. They are an important part of our educational system. And as the next generation of workers enters the marketplace, computers will be as necessary to them as calculators and phone systems were to their predecessors. <<<>>> Title : What's your mandate?
Author : CW Staff Source : CW Comm FileName: mandate1 Date : Jan 9, 1989 Text: ``I'm striving to get the senior management team much more involved in the strategic planning of the use of information technology. The automobile industry is extremely competitive . . . and however we can use the computer as a weapon to succeed is the mandate. Part of the strategy is to get tools into the hands of every employee. Whatever we can do to help people do their jobs, whether the tool is equipment or access to the main host, we'll try to give it to them.'' ``Our primary mandate is to develop and support some new products and services the company will be offering. Second, we will begin a major upgrade in our operating system software from VM/VSE to VM/XA and MVS/XA and eventually to MVS/ESA; it will be a multiyear project, but in 1989 we will make the most significant change. Third, we will define architectures that will allow the use of LANs and PCs in some of the major business units. We will also be upgrading the IBM 3090 we just installed.'' ``For 1989, we will be focusing on quality, customer services and profitability. Our MIS group is unique because we're close to the company's business projects, such as the 800 and 900 services _ those are company products, but they are really MIS responsibilities. Also, we're going to be opening up our back-end systems to let our customers use them as well. . . . We will let our customers come in and look at their own records in detail.'' ``The biggest thing we're trying to accomplish in our industry is to constrain the cost of data processing. The other significant issue is measuring our contribution to the bottom line. Our focus is changing from doing everything for everybody to really managing our corporate use of DP and getting a good return for it. ``We're looking for a place to test out artificial intelligence concepts . . . and we're looking at building a marketing base that is very new to our business. 
We have a lot of equipment out there, and we're looking for innovative ways to use it to generate revenues.'' ``We're continuing to develop new software and information systems. We will also focus on developing global information systems. In the next two or three years, we're going to focus 100% of our attention on an international information system so that we will essentially have identical systems working in about 100 countries. The third aspect of our future emphasis will be on research and development for our channel devices [small personal computers that allow customers to do business with Federal Express on-line].'' ``Cost containment and cost reduction are the broad issues we will address in 1989. We will look at what is best for the company _ centralized or decentralized mainframe processing. We've decentralized in the applications development area over the company's nine divisions. Other things that are the significant driving forces for us are customer service, quick response and electronic data interchange. We also want to expand our use of CAD in fabric design.'' ``The bottom line for us is providing a cost-effective accomplishment-oriented information resource function to directly support the company in meeting the goals of its approved annual operating plan. This goal will be achieved through the implementation of new major application systems, the technical upgrade of our computer facility and the addition of technical specialists in appropriate areas.'' ``We're bringing the company into the 1980s. Our company has been a little remiss in providing funding to information systems, and they've now recognized that MIS can provide them with a competitive advantage. Our mandate is to move the company off the old technology. Last year, we were basically on IBM System/38 technology. We recently installed minicomputers in each of our stores, and we're now linking them all to a mainframe. 
We're converting from the System/38 technology to mainframe technology.'' Computerworld senior writer Alan J. Ryan interviewed MIS managers in a variety of industries to find out what their mandates will be for 1989 and how computing can affect the bottom line. <<<>>> Title : MIS managers await RISC p Author : Rosemary Hamilto Source : CW Comm FileName: misrisc Date : Jan 9, 1989 Text: Look to nearly any computer vendor and you'll find a company that recently presented big plans for a RISC platform. But look to nearly any MIS manager and you'll find someone who doesn't care much about the flurry of RISC activity. Interviews with MIS executives indicate that RISC is not a part of mainstream computing today and will likely play only a small role in corporate computing strategies in the early 1990s. ``It's not part of my long-term strategy, but it certainly has my attention,'' says Dennis Klinger, vice-president of MIS at Ryder Truck Rental, Inc. ``I want to experiment with it, but other than that, we're just watching it.'' Profitable RISC Nonetheless, RISC is a selling tool for most vendors _ from IBM and DEC to Sun and smaller companies like Mips Computer Systems. ``We see RISC as a huge buzzword, but users aren't going to buy a system just because it's RISC,'' says Vicki Brown, director of systems research at International Data Corp. (IDC) in Framingham, Mass. Analysts expect the RISC hype to continue next year as more products become commercially available and the different RISC camps challenge each other for market dominance. RISC is being touted by some as the salvation for computing because it is expected to pick up where traditional CISC processors leave off. As CISC processors reach the physical limits of how much can actually be loaded onto a single processor, RISC processors are expected to take systems to new levels of performance. ``RISC technology is really the promised land,'' says David Burdick, an analyst at Dataquest, Inc.
``We've just about run into a wall with CISC processing. If you talk to Sun, they believe they'll go up to 80 MIPS [with Sun's Sparc chip] before the decade is over. I believe that.'' RISC relies on fewer instructions than CISC processors do. While a CISC processor's instruction set can run to around 500 instructions, RISC processors normally use fewer than 200, Burdick says. With fewer instructions, processors can be dedicated to specific jobs, and the smaller set of instructions can be executed more quickly. The simple design also lends itself to a multiprocessor or parallel processor architecture more than the conventional CISC architecture does, Burdick says. It is through this multiple-processor option that RISC-based systems will achieve MIPS ratings that surpass current high-end mainframes. ``Traditional architectures are reaching a dead end in terms of getting more performance,'' IDC's Brown says. ``RISC will offer a lot more scalability, and that's important.'' Not only will RISC-based systems bring performance improvements, but they will also be offered at lower prices than comparable CISC-based systems, analysts say. Since manufacturers will be able to build systems with simpler designs, their costs will be lower; they, in turn, can pass those savings on to users. But there is a trade-off. With the price/performance benefits comes a new platform, and that means existing software can't simply be ported to it. ``No question about it,'' Burdick says. ``The problem [for RISC] with mainframe users is all their software.'' Despite the promises of RISC and because of the drawbacks, analysts say they do not expect MIS managers to rally around the RISC cause. Managers are more concerned with the overall price and performance a system offers than with the chip on which it is built. ``We don't sit around and talk about RISC,'' says Dean Allen, corporate vice-president of information and administrative services at Lockheed Corp. ``We have a job to do.
To the extent that it offers better cost and performance, we'll pursue it. But we're not putting a big RISC strategy in place.'' Other users express a similar view, saying they expect vendors to provide a total package and that a vendor's talk about RISC doesn't affect them. ``We don't buy boxes anymore,'' says Patricia Stadel, administrative vice-president and director of information technology at Addison-Wesley, Inc., a publishing company in Reading, Mass. ``Frankly, I couldn't care less if they do it with a complex or reduced instruction set.'' Both users and analysts expect RISC to eventually emerge in mainstream computing as a component of an overall architecture. Certain processor jobs could be best served by RISC; others must continue with a traditional CISC approach. RISC will play its most critical role in applications in which raw computing performance is the top priority and jobs _ such as repetitive mathematical calculations _ in which the actual number of instructions required is limited. As a result, analysts expect to see RISC-based systems do particularly well in engineering environments. ``RISC will become especially beneficial in the workstation area, where users are not managing multiple operations,'' says John Logan, president of the Aberdeen Group, a market research firm in Boston. But the raw computing performance won't win many MIS managers' hearts, Dataquest's Burdick says: ``To a general-purpose MIS environment, performance means supporting users adequately.'' At CSX Technology, the information systems arm of CSX Corp., RISC has been pegged for a small niche in the overall computing strategy, according to Jack Cooper, president of CSX Technology. ``It's exciting, but it's for a selective niche for us, like a high-performance [application such as] process control,'' says Cooper, who oversees an installation of several IBM 3090s that service his transportation firm. ``We see it as an engineering/scientific technology.
We don't see it as mainstream.'' Performance more important The Market Decision Systems group at Shearson Lehman Hutton in New York recently installed a Sun-4 system, a RISC-based workstation that is said to process at rates of up to 10 MIPS. But according to Gary Handler, vice-president of Market Decision Systems, the Sun system was selected because of its performance, not because of RISC technology. ``I haven't made up my mind about what is better,'' Handler says. ``I went with Sun because of performance. I wanted a demon compiler. It wasn't a conscious decision between RISC and CISC. I don't care if it's CISC or RISC or whatever, as long as they can bring me 10 MIPS in a box.'' Malcolm MacKinnon, a senior vice-president at the Prudential Life Insurance Company of America who heads up the company's MIS effort, says he views RISC as an issue for manufacturers, not users. ``My focus is for faster, better and cheaper,'' MacKinnon says. ``We're concerned with how programs can run on a platform, not about how it's built.'' By Rosemary Hamilton; Hamilton is Computerworld's senior editor, systems. <<<>>> Title : Focusing on standards, op Author : CW Staff Source : CW Comm FileName: mcnealy1 Date : Jan 9, 1989 Text: One of the most significant trends, one that will enable computers to proliferate the way automobiles did 70 years ago, is the move toward standards and true open systems. The proprietary architectures and operating systems some companies are still promoting (and often protecting through threatened legal action) are not the right solution, since they limit the user's ability to choose. Imagine what would have happened to the auto industry if each model of car had the brake, accelerator and steering wheel in a different location. Increasing standardization will make the computer an even more essential tool in the workplace, letting us further boost productivity and rapidly exchange information.
But these goals demand an even greater commitment to networks and distributed computing. One result of networking _ electronic mail _ could have an enormous impact on the workplace in the near future, as more people are brought closer together via computers. Sophisticated computers and connectivity could bring about fundamental shifts such as working at home or all-electronic purchasing. Meanwhile, as computers and telephones become more closely linked, the computer could become not only a work center but also a communication center. Computers will continue to take over time-intensive, often tedious tasks such as inventory and scheduling, leaving individuals more time for creative work. At the same time, powerful computers running artificial intelligence applications can take on more demanding jobs. Computer technology will continue to refashion the way we work at an accelerated pace. In time, a computer _ like an automobile _ will become such an integral part of our lives that we can't imagine working without it. <<<>>> Title : RISC: sweet music or just Author : Stan Kolodziej Source : CW Comm FileName: risciest Date : Jan 9, 1989 Text: The RISC market is filled with contradictions. RISC vendors are priming the market pump, but some observers forecast a shake-out. Analysts see a consolidation along the lines of the carnage that laid waste the complex instruction set computing, or CISC, market years ago. While vendors extol the increased processing power provided by reduced instruction set computing, MIS appears to be indifferent to such claims. If there is a risk for vendors with RISC, it is that MIS just won't care one way or the other. As usual, the truth probably lies somewhere in between. RISC might not fire MIS interest in 1989, but it's certain to change the rules of the processor game. For one thing, RISC will still represent an important entry point for new vendors, creating a new industry alongside a market polarized around the Motorola, Inc.
68000 and Intel Corp. 80386 and 80486 processor families. The list of RISC players is already long and diverse, ranging from such small companies as LSI Logic Corp. and Cypress Semiconductor Corp. to heavy hitters like IBM and Digital Equipment Corp. And there are at least a dozen different RISC architectures already in the marketplace. Even so, the RISC market is young and still undefined, enabling new RISC entrants in 1989 to gather some critical mass for their products. Yet the RISC entry price will be getting steeper. RISC hardware is not likely to be much of a problem to develop, and in many ways it represents a commoditized product, following much of the processor market in general. The software part of RISC is something altogether different. ``The [RISC] software is tough for newcomers,'' explains Michael Mahon, a system architect at Hewlett-Packard Co. ``There are things like software compilers, mapping, porting [Unix] kernels and a heck of a lot of software fine-tuning to handle. It's a deterrent.'' A conflict of strategies There are also signs that as the RISC market enters 1989, its freewheeling days will be left behind. The RISC market in fact is already closing ranks, coalescing around a set of very definite vendor strategies that could quickly pull smaller RISC companies into their orbits. Right now, much of that pulling is being done by Sun Microsystems, Inc. and Motorola. Sun is clear about its RISC strategy for 1989 and beyond: License as much of the development of its Scalable Processor Architecture (Sparc) as possible to other companies, including Cypress and Fujitsu, Ltd. and, more important, big guns such as Texas Instruments, Inc. By keeping licensing fees low, Sun hopes to attract more developers in 1989. Farming such RISC development out, Sun aims to leverage the production of its Sparc processors and create a hotbed of market competition to improve on Sparc. 
The end result, according to Dave Ditzel, Sun's manager of advanced CPU architecture, will be more powerful Sparc processors produced at lower costs. Sun's endgame is to aim Sparc-based workstations at the heart of the mainstream PC desktop market. ``We want to do in the RISC market what the clone manufacturers did in the [IBM] PC market,'' explains Anthony West, director of international business development at Sun. ``We want to break it open. What we lose in licensing royalties we will more than make up through Sparc's market presence.'' Sun sees the existing workstation market as three-tiered, with Sun sitting comfortably at the low and high ends and Motorola dominating the middle. West says Sun temporarily left much of the middle ground to Motorola's entrenched 68000 series processors. The plan, however, is to put Motorola in a vise, squeezing that firm from the top with Sparc-based workstations and from the bottom with Sun's 386I workstations, which are based on 80386s revised to run both MS-DOS and Unix. The 386Is, West declares, will only get more powerful. Different drum Motorola's RISC strategy for 1989 marches to a different drummer. Motorola plans to first create software critical mass behind its 88000 RISC processors. The more software available for the 88000, Motorola reasons, the more systems integrators will support the 88000 and find willing users. Motorola's big vehicle to accomplish this is 88open, a consortium of software and hardware vendors supporting Motorola's Binary Compatibility Standard. BCS is a set of specifications requiring all software written for systems built around the 88000 to use a common interface to executable or binary programs. BCS gets to the guts of software compatibility across hardware platforms and runs deeper than source code compatibility. The end result is that a program following BCS specs and written for one particular 88000-based system is virtually assured to run on another 88000 system.
That will be a big plus, eliminating today's tedious necessity of recompiling programs to run on various different RISC processors. With BCS conformity, programs would theoretically bridge the various Unix versions in the market now and could even apply to non-Unix operating systems as well. Traditional CISC processors never had such software portability. As software goes, most observers agree, so goes the RISC market. ``There is no such thing as a RISC market,'' says David L. House, Intel's senior vice-president of microcomponents. ``It's a Unix market.'' Motorola's BCS is actually riding on the back of AT&T's binary effort. BCS is itself a subset of AT&T's proposed Application Binary Interface (ABI), which is part of AT&T's long-awaited Unix System V, Release 4, set for introduction in late 1989. ABI has been available to developers for some time. Both Motorola and AT&T stress mutual agreements whereby AT&T will use BCS as the basis for an 88000 ABI and 88open will try to extend BCS to full ABI compliance. The impact of ABI on the Unix/RISC world will be significant. ``I don't think people realize how important ABI is,'' claims Don Anderson, vice-president of the Advanced Systems Department at Toshiba America, Inc.'s OEM Division. ``ABI is going to give systems vendors shrink-wrapped Unix programs that run across processors,'' Anderson maintains. ``That's just what the Unix market needs.'' Motorola's efforts with BCS are putting some fright into Sun, which is planning to bring Sparc licensees together in 1989 in an effort to stabilize Sparc work around certain agreed-on architecture specs. Part of that work will involve binary compatibility across Sparc processor lines. According to Sun's Ditzel, ``As competition in the Sparc arena drives prices down, we will also develop large Sparc software bases. 
That's essential.'' And Mips Computer Systems, Inc., a Sunnyvale, Calif., RISC company that was recently courted by DEC, will use its Synthesis Software Solutions spinoff to help convert software from a number of companies, such as Relational Technology, Inc. and Unify, to Mips Computer processors. With the substantial presence of DEC supporting the company, software support for Mips Computer processors could surge in 1989. Motorola and Sun are running hard, and for good reason. Intel is supposed to start prowling the market with its own family of RISC processors in 1989. To confuse the competition, Intel has a number of internal code names for its RISC project, of which Race and the more prosaic N10 are only two. Whatever the name, Intel's RISC processors stand a good chance of being packaged and sold by Intel in various system configurations right from the start, targeting these RISC systems for specific applications _ something Intel has been doing more of recently with its 80386 line. The initial push for RISC to enter the mainstream commercial market will be as competitive replacements for high-end PCs, especially as networked workstations and powerful file servers _ the latter usurping a traditional role of minicomputers. Don't hold your breath, however. Despite the prediction of Sun that RISC will rapidly enter mainstream commercial computing, chances are the markets RISC will strike at in 1989 will not vary greatly from the traditional RISC bastions of the scientific/engineering and computer-aided design and engineering fields. Though Fred Rehhausser, Motorola's VME product planning manager at its Microsystems Austin Design Center in Texas, says 1989 will represent explosive growth for the RISC market, Rehhausser adds that his company is also under no illusions as to where the bulk of that RISC revenue will be derived for some time to come. 
``RISC will be used in the engineering and scientific communities, and especially in real-time embedded control applications where software compatibility is not such an issue,'' Rehhausser explains. ``But we do see some niche applications in commercial computing, such as AI. That is where support from relational database vendors becomes important.'' Intel's House agrees. ``RISC is being overpositioned,'' he says. ``RISC will be immensely important in embedded applications, but we don't see it taken that seriously in the commercial marketplace. This year [1989] will begin to see a repositioning of RISC.'' If it happens, HP hopes the repositioning isn't that severe. The company, after all, just introduced its HP-PA, a new line of minis based on its RISC Precision Architecture, and the HP-PA is aimed right at commercial computing applications. ``The cost of ownership of RISC machines is too attractive now to ignore in commercial applications,'' maintains HP's Mahon, who adds that 1989 will be the year RISC computers make a splash as network file servers, or at least that HP machines do. The more the merrier? There could be other potential RISC vendors to watch for in 1989. Apple Computer, Inc. is said to be seriously eyeing the market, and DEC is expected to introduce its own RISC machine based on Mips Computer technology sometime in the new year, amid speculation that the Maynard, Mass.-based giant might also have quite a time trying to make the new RISC machines binary-compatible with its existing proprietary VAX processor line. Where would a market be without IBM? The Armonk behemoth is definitely a RISC vendor to watch in 1989. Although on the surface IBM's RISC strategy might not seem as clear as some of its competitors', don't be deceived. Until recently, IBM's RISC work seemed to be on the fast track to nowhere. IBM's RT workstation line was faltering, though some recently upgraded models are being more warmly received. 
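Burdick's point earlier in this story _ that a CISC chip may implement hundreds of complex instructions while a RISC chip gets the same work done with a smaller set of simpler ones _ can be sketched in miniature. The snippet below is a toy illustration only; its function names and "instruction" mnemonics are invented for this example and model no real processor.

```python
# Toy sketch of the RISC idea: a single complex, CISC-style
# memory-to-memory add is decomposed into the simpler register-oriented
# load/add/store steps a RISC machine would use instead.

def cisc_add(memory, dst, src):
    """One complex instruction: add memory[src] into memory[dst] directly."""
    memory[dst] = memory[dst] + memory[src]

def risc_add(memory, dst, src):
    """The same work done as three simple instructions through registers."""
    r1 = memory[src]      # LOAD  r1, src
    r2 = memory[dst]      # LOAD  r2, dst
    r2 = r2 + r1          # ADD   r2, r2, r1
    memory[dst] = r2      # STORE r2, dst

mem_cisc = {"x": 2, "y": 3}
mem_risc = {"x": 2, "y": 3}
cisc_add(mem_cisc, "x", "y")
risc_add(mem_risc, "x", "y")
assert mem_cisc == mem_risc  # identical result; the RISC path just uses simpler steps
```

Because every RISC-style step is small and uniform, a real design can overlap many of them in a pipeline or spread them across multiple processors, which is where the performance gains the analysts describe are expected to come from.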
A good part of how IBM is destined to get its RISC up to speed, however, is through its affiliation with the Open Software Foundation (OSF), a consortium of vendors opposing what they see as AT&T's Unix stranglehold. IBM somehow convinced other OSF heavyweights to use IBM's AIX Unix derivative as a base on which to build a new Unix software infrastructure. Since AIX was optimized for IBM's RTs, the OSF agreement can only breathe new life into future RT sales. And given the tight link between Unix and RISC, the OSF will make IBM a RISC player to contend with. Ironically, IBM, which prides itself on its technical and marketing acumen, could in the end wind up getting much greater RISC mileage through the likes of OSF power brokering than it could through technical and marketing skills alone. (For similar reasons, 1989 could be a good year to keep an eye on a company like Apollo Computer, Inc., which is busy creating a software political infrastructure that resulted in IBM's agreement to include key parts of Apollo's Network Computing System software specs in AIX. Such a compact could boost Apollo's sagging sales for its Prism RISC-based workstation line and make the company a factor in RISC software. For these reasons and others, 1989 could prove to be the year that software politicking becomes a definitive art form.) Shake-out? Most observers say don't look for dramatic changes in the RISC market in 1989, but they also don't discount the possibility of a shake-out looming just beyond. If new vendors enter the RISC market and if MIS stays cool to RISC, there could be trouble in what some call an already crowded RISC field. When too many processors chase too small a market, shake-out is inevitable, says Andreas Schreyer, Motorola's RISC line manager. ``A decade ago there were about 10 different CISC architectures vying in the market,'' Schreyer says. ``Soon there were only Intel and Motorola. 
``There are now about a dozen RISC architectures; to me, that says the writing's on the wall.'' By Stan Kolodziej; Kolodziej is a Computerworld Focus on Integration senior editor. <<<>>> Title : The net manager's apprent Author : CW Staff Source : CW Comm FileName: sorcerer Date : Jan 9, 1989 Text: All right, so I couldn't resist. I admit that I took my two kids to a rerun of Walt Disney's Fantasia for reasons of nostalgia. When I was a kid, Mickey Mouse was the best character going. In Fantasia, which I've seen a dozen times since, Mickey got to play the Sorcerer's Apprentice. That was the fellow in the robe and sorcerer's hat who summoned his master's magical spell to get his housework done by legions of brooms. Well, I, too, had been swamped with housework that weekend, and I guess I must have dozed off during the Sorcerer's Apprentice part. Soon I was having my own nightmare _ but this was about telecom standards, not brooms. Here's how it went: In my dream, I was the telecommunications apprentice of a master network manager. I worked at a corporate office in the northern Chicago suburbs, having recently been transferred from computer ops. Anyway, it didn't take long until my boss was called out of town for an emergency conference of the CCITT X.700,000 committee in Geneva or Brussels or somewhere. It seems the international standards folks had to decide on a few things before the end of the year. As he threw on his coat, the old man shouted some instructions over his shoulder. It sounded like, ``Don't worry, be happy,'' but I later found out it was ``When in doubt, use TCP/IP.'' Good advice for a mixed-vendor network. Well, the boss was probably winging his way over the Atlantic when I got my first call on the Help desk. Seems like one of our Chicago end users wanted to access a European database but didn't know how to use the CCITT's X.75 internet standard. 
``No problem,'' I said and rushed over to the stack of communications manuals gathering dust on the old man's file cabinet. Turning the dog-eared pages of one of the manuals, I discovered something I had always suspected. These standards had multiplied like bunnies. All the familiar CCITT X. standards were there, along with the well-known IEEE series. There were others, too _ ones I'd never even heard about. Communications standards were not the only ones that had proliferated. A second manual, this one on software, outlined the differences between AT&T's Unix System V, Xenix and IBM's AIX. While I was thumbing through the manuals, I happened on a pink-colored cheat sheet in the master's own shorthand that summarized the differences between the standards. ``Wow,'' I thought to myself, ``Even he can't keep them straight.'' Fortunately, my caller soon abandoned his quest for the data stranded in that far-flung European database. Seems my Chicago end user couldn't get his log-in sequence straight, anyway. Relieved that my first close call was over, I got some coffee at the Help desk's coffee station. Then I returned to the network manager's cheat sheet, which amounted to a list of macros that sorted out kinks in our firm's global network. Just as I was memorizing my first macros, the Help-line phone rang again. By this time, it was late evening, and Tokyo had just come on-line. Seems the Japanese office wanted to know how to transfer a Unix file to my boss' MS-DOS-based PC. ``No problem,'' I said. ``Just tell me which version of Unix you're using, and I'll call up a specialist.'' It would, indeed, take a specialist to understand all the variations of Unix on the market today. So I called up my Uncle Don, who had worked at Bell Labs before Unix was born, and he told me a little bedtime story: ``Once upon a time, Bell Laboratories gave birth to a new operating system called Unix. I've heard his daddy was AT&T and his mother was a PDP-7 minicomputer. 
``In his early years, Unix was the favorite plaything of the scientific community,'' Uncle Don continued. ``But when he was older, Unix got a California cousin, a rival called the University of California at Berkeley Unix 4.1. ``Strange thing, though,'' Uncle Don said. ``When Berkeley 4.1 grew up, they started calling him 4.2 and 4.3 _ and I hear he doesn't look a thing like Unix System V. But I did hear that Berkeley's father, Bill Joy, is trying to conduct a genetic engineering experiment _ something about merging the best traits of Berkeley 4.3 and AT&T's Unix System V. Kind of a test-tube baby.'' Get to the point ``What's the point of this story, Uncle Don?'' I asked, as I was under a good deal of deadline pressure. ``Well,'' Uncle Don said, ``some people didn't like Unix's father at all. That would be AT&T, who lives in New Jersey. Then, in 1988, another family wanted to adopt Unix. They called themselves the OSF _ but what that stands for, I don't know. Right now, they're preparing a nursery for the new Unix offspring, but they won't show anybody the wallpaper or the carpet _ and they won't grant Daddy AT&T any visiting rights.'' ``But what can I do to help my Tokyo end user right here and now?'' I asked. ``Tell him it's morning in Japan,'' Uncle Don advised. ``Your Japanese user should go out to the local computer store and buy a version of Unix your Chicago PC can accept. He's going to have enough trouble sending the darned files halfway around the world. Thanks for the call, but I've got to turn in myself.'' That was Uncle Don's way of signing off. After my bedtime story, I realized with no little horror that everyone else had gone home. ``Please call back later,'' I told the Japanese caller. Anyway, with all those foreign standards to learn, I wasn't ready to learn about Kanji characters, too. OK, OK. Things had gotten a bit crazy, but I was still in one piece. The machines were humming. The Help desk phones weren't lit up. 
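The Tokyo caller's problem, incidentally, is a real one even outside a dream: Unix text files end each line with a bare linefeed, while MS-DOS expects a carriage return/linefeed pair, so a file moved between the two systems needs its line endings converted. A minimal sketch of the conversion on the Unix side (the filenames here are purely illustrative):

```shell
# Convert a Unix text file (LF line endings) into MS-DOS form (CRLF)
# before shipping it to the PC. Filenames are hypothetical examples.
printf 'quarterly totals\nregional summary\n' > report.txt    # sample Unix file
awk '{ printf "%s\r\n", $0 }' report.txt > report.dos         # add a CR before each LF
```

Going the other way, stripping the carriage returns with `tr -d '\r'` restores the Unix form.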
That's when I saw it _ MY FIRST BIG PROBLEM. The oversized IBM Netview screen was displaying a Systems Network Architecture system alert I'd never seen before. ``Oh no!'' I said. ``I wonder if it'll go away by itself.'' What I didn't know, or couldn't have known, is that Netview was better prepared for the task of network management than I was. It already knew which things to worry about and which ones not to worry about. But it was also patiently trying to tell me that a major link had gone down in Texas somewhere. Seems a tornado had ripped through central Texas, around Austin. That's when the Help desk hot line started doing its impersonation of a Christmas tree. And that's precisely when I could have used Mickey Mouse's secret formula to multiply my hands or fingers to handle all those phones. Mickey's formula seemed to work well enough on brooms. ``Network hot line. Can I help you?'' I forced myself to say. ``Hey, who is this, anyway?'' the caller demanded. ``We're in Austin, and we've got to hook up to Waco's VAX. That darned tornado has ripped the roof off our office, but the show has to go on. Our accountants say so. What's worse, our IBM mainframe doesn't even have a DEC-SNA interconnect package!'' For a fleeting moment, I heeded my network master's last-minute advice. ``Why don't you try using TCP/IP?'' I ventured. But the howl from the wind that was plastering computer printouts against the Texas data processing shop's walls drowned out my weak suggestion. Well, that Waco plant was one of our biggest factories, and without orders from the corporate IBM mainframe, the entire operation might be forced to close down. Just then, I remembered my boss' cheat sheet and cooked up a log-in sequence that probably had never been tried before. It's no surprise that it didn't work. Instead of connecting the two Texas sites, I'd inadvertently gotten the Texas users logged onto that Japanese mainframe. 
Boy, was I going to be in trouble when the old man got back from Europe. That's when I tried patching the two Texas circuits together by hand. Not only didn't this work, but I shorted out the modem rack so badly that sparks started flying everywhere. I hoped the flashes wouldn't land on some errant piece of dust behind the rack and start a fire. In the nick of time Just then, the door flew open. I didn't know it yet, but help had arrived. I dove under a swivel chair to stay out of sight. It was 7 a.m., but the new day wasn't getting off to a good start. What began as calls from the stricken Texans had mushroomed into angry calls from coast to coast. New Netview messages were coming on the screen, fast and furious. To my great relief, in walked the old man. Seems his plane never took off from snowed-in O'Hare. He had spent the entire night trading networking tricks-of-the-trade at O'Hare's Skyline Bar with a French communications specialist from IBM's LeGaude Lab. The former member of the French Resistance got a big charge out of the international standards movement, it seems, because it was really a nice way to fatten up his research and development budget. He's still working on making SNA conform to the Open Systems Interconnect. The old man surveyed the damage, glancing from the smoking 19-in. racks in the back to the blinking red lights on the Help desk phones. ``Go back to computer ops, where you came from!'' he growled. ``And give me back my cheat sheet.'' With that, he got on the phone with Japan, read them a few lines of code and hung up. Another problem solved by the network master. I guess I'll never really know what I did wrong or right that night in the network center. All I know is, I'm glad I'm not a network manager anymore. I'm especially glad I didn't have to memorize all those standards. And mostly, I'm glad that the stirring music of Fantasia jarred me out of my dreams and landed me back in a Chicago movie theater _ where I belonged. 
To borrow from another great flick, ``There's no place like home. There's no place like home. There's no place . . . '' By Jean S. Bozman; Bozman is Computerworld's West Coast bureau chief. <<<>>> Title : Fast speeds, slow expecta Author : William Brandel Source : CW Comm FileName: billfore Date : Jan 9, 1989 Text: If the adage is true that anything worth doing is worth overdoing, the personal computer market has found its motto. Now the market's task at hand is to rediscover its mission. The PC market appears to be cooling after a long period of rapid growth. Technological developments and manufacturing advances are allowing vendors to push the PC performance envelope. But now the question that MIS asks is, How much PC is too much? Like the auto industry, the PC business is driven by consumer and business technology. PC makers can benefit from something the automakers learned the hard way about power hunger. As Detroit ground out luxurious gas-guzzlers that were rivaling artillery tanks in size, the possibility of an oil crisis that would bring U.S. auto manufacturers to their knees was as remote as the possibility of a memory chip shortage wreaking havoc in the PC market in 1988. The oil crisis forced the auto companies to downsize their designs and their expectations. Are PCs heading down the same highway as Detroit? Currently, powerful Intel Corp. 80386 processors are hitting the market, and the more powerful 80486 is just around the corner. Meanwhile, MIS is still figuring out how to get the most out of the adequate 80286 machines. ``There's not enough need for the amount of PC we're presently using,'' says Don Whittington, MIS director of the Michigan Sugar Co., based in Saginaw, Mich. But Whittington notes that his company is using PCs based on the Intel 8088 processor _ not a high-powered processor by anyone's standards. ``Our people are just using them for word processing and spreadsheets, and they can only do so much of that,'' he says. 
Michigan Sugar is an example of what is occurring in many businesses. As it is the norm to automate in business today, it is also the norm to upgrade. But because of long-range budgeting and expected PC advances, many companies own more PC power than they need. And while underutilizing their PCs, employees also waste time trying to figure them out rather than doing their jobs. But as the out-of-control market is now being tamed by corporate budget restraints, MIS is facing different PC decisions. ``Days of rampant buying are over,'' says Bruce Stephen, a PC analyst at International Data Corp. (IDC), a Framingham, Mass.-based market research firm. As a result, ``Companies are becoming much more conservative with how they use PCs. ``PC products will not repeat the success of the E.T. video,'' Stephen says, alluding to the videotape sales blitz that occurred as soon as the popular movie was released into the home entertainment market. ``The PC market has been saturated with technology since the PC was first introduced. MIS budgets are tightening, not getting larger, and PC managers are being very careful about how they implement, study, gather evidence and make sure products are deliverable before they commit to any type of buying.'' Stephen's analysis reinforces what other analysts and MIS officials believe is a new phase of PC organization. Managers are reassessing the PC's role in their businesses. ``Forthcoming PC technology will be both good and bad,'' says Richard Kulper, MIS director at Sulzer Bingham Pump, Inc., based in Portland, Ore. ``But if not planned properly, the PCs will come back to cause problems. And the hardware strategy cannot be controlled without spending a lot of money on [employees'] software.'' Controlling the PC environment for a business includes estimating the amount of power required to perform the tasks that users need. With the new age of PC computing comes new budgeting priorities. 
Some companies are centralizing purchases and basing them on brand names; other companies purchase every product upgrade. Often, the needs of the users are overlooked. One MIS manager notes that most of the PCs in her company are not utilizing the resident hardware and software. ``Corporate wants us to use Dbase IV and asked me if I needed it or not,'' says Melissa Broadway, a data processing coordinator at Lomas & Nettleton Financial Corp. in Dallas, Texas. ``I think we should look at Dbase III Plus and see if that is being used to its full capacity yet. We were just getting to use III to its full capacity when it was upgraded.'' Catch-22 MIS and industry analysts concur that upgrading is overrated. In most cases, they say, productivity is not increased but depleted, and a catch-22 between ``more productive'' software and ``higher performance'' hardware begins. For example, Michigan Sugar's Whittington notes that when a higher performance PC is brought into the office, the end results have worked against the stated intent: to make the employee more productive and useful to the company. ``Some people start playing with their PCs, trying to get them to do more for them, and they are usually the ones who become less productive,'' he says. ``Most of the things that our people are using PCs for can be done faster by hand,'' Whittington adds. The issue comes back to where it started. Is the PC the means or the end? The emphasis, MIS directors say, should once again be on end users and what they are trying to accomplish with their PCs and making it easier, not more complex, to use the machines. Familiarity breeds contentment So back at square one, MIS calls on an old friend. Many managers say they believe that the strongest feature that drives PCs is user familiarity with the operating system _ DOS, in most cases. 
``Why switch to OS/2 if you are comfortable with DOS?'' asks David Carnevale, vice-president of microsystems research at Infocorp, a Cupertino, Calif.-based market research group. Carnevale's group analyzes PC user trends. ``It is basically against human behavior to change when you are comfortable with an operating system unless something comes out that is important enough [to warrant a change],'' he explains. MIS managers maintain that they are hard-pressed to find something brilliant that is new in the PC market _ or anything new enough that would move people to change their work habits. Under this logic, users do not have much cause to switch to OS/2. The major impediment to OS/2 is simple: There is no demand for it. Not many applications are available that can run on it, and users say they do not want to switch from DOS. OS/2 is difficult to use without the Presentation Manager, a graphical user interface co-developed by IBM and Microsoft Corp. The complete system is also expensive to use: Carnevale expects a fully loaded OS/2-running machine to start at around $14,000, about $12,000 more than it costs for a basic IBM compatible running DOS. Analysts and MIS managers are scaling back their views of the demand for OS/2. Figures from Dataquest, Inc., a market research firm in San Jose, Calif., indicate that despite the Presentation Manager's Oct. 31 introduction, affordable applications that are compatible with it will not be out until 1990. Neither Dataquest nor IDC expects user demand for OS/2 to exceed that for DOS until that year. DOS will continue to be strong into the 1990s, IDC's Nancy McSharry says. Seemed like a good idea So why OS/2? Analysts and MIS managers say that it seemed like a good idea before it arrived. But once in the office, it's a horse of a different color. Because of human psychology, users look at IBM's Video Graphics Array and Enhanced Graphics Adapter screens, and suddenly Color Graphics Adapter screens look funny. 
You look at a laser-page printout and dot matrix does not look so good. Word processing performs better with a faster CPU. Which brings the argument back to the machines that run this software. Lomas & Nettleton's Broadway explains her dilemma. ``We [were] looking at PS/2s, which I think are an interesting idea,'' she says. ``But then there's software. More and more, our users are becoming familiar with the term `vaporware.' We were interested in OS/2, but the enthusiasm dwindled after it came out. We were also at first excited about PS/2s, with all the memory advantages without being physically larger machines.'' But Broadway says the excitement about the PS/2s waned when she discovered problems trying to connect the machines to the mainframe. ``So we turn back to the XTs and ATs instead,'' she adds. ``How do you recover your loss?'' Broadway says that if the PS/2s cannot be integrated into the mainframe environment, they become a ``dead issue,'' or virtually useless for her integration purposes. ``I guess I'll have to find somewhere in the company to put these PS/2s.'' ``The PC market will experience specialization, a segmentation by brand into different product applications,'' IDC's Stephen predicts. ``With stand-alone desktops running on Intel 80386 processors, the market cannot really equate these products as standard personal computers.'' Most analysts say they agree that these machines, currently based on the 80386 processor and soon to include the 80486, are a bit much to put on one individual's desktop. Another factor that enters into the overall picture is the desktop cost structure. IBM Personal System/2 Model 70 and 80 machines can range anywhere between $6,000 and $10,000. Stephen estimates that machines built around the Extended Industry Standard Architecture, an alternative bus built by IBM's Personal Computer rivals to compete against the Micro Channel Architecture (MCA), will start in the $12,000 to $15,000 price range. 
But currently, the majority of the million-plus MCA-based PS/2s at user sites are in stand-alone configurations. This means that most of these users are operating with, on average, $8,000 of hardware alone. Considering that few software or hardware products are available to take advantage of the PS/2 architecture, it is a safe assumption that these machines are currently not being utilized to their full potential. Common sense prevails But even in the midst of some very confusing predicaments, there is a light in the darkness: common sense. As companies try to be more competitive by using technology, MIS must consider that employees are using the PCs as tools, not to do their jobs for them. Common sense calls for MIS to deliver PCs that are appropriate for the users' tasks and to ensure that the machines can be integrated into the larger system. If the PC cannot be used to assist the employees' productivity, the purchase works against _ not for _ its business purpose. By William Brandel; Brandel is a Computerworld senior writer. <<<>>> Title : PCs not yet part of CAD s Author : Julie Pitta Source : CW Comm FileName: worka Date : Jan 9, 1989 Text: Personal computers are slowly making inroads in engineering departments as tools for less demanding applications. However, Unix-based engineering workstations continue to be the machine of choice for serious design work. ``I think you can probably make any high-end PC into a workstation,'' says Alan E. Holley, principal consultant for engineering computing at Hughes Aircraft Co.'s Ground Systems Group, the largest of the six groups that make up Hughes. ``But what machines are actually purchased tend to be based on what people have been used to using in the past.'' Hughes is using Sun Microsystems, Inc. workstations to perform computer-aided software design for state-of-the-art defense systems. PCs so far have been reserved for administrative tasks. 
As with many engineering departments, engineers at Hughes tend to use two types of machines _ Suns for design work and a PC to run traditional applications such as spreadsheets, word processing and database management. Jeff Ehrlich, MIS director of General Electric Co.'s Medical Services Group, buys every kind of PC imaginable for his users, from Tandy Corp.'s IBM Personal Computer compatible to Apple Computer, Inc.'s Macintosh II. However, the engineering group has not adopted the PCs brought in for its design work. ``Sun is very strong with the engineers,'' Ehrlich says. Not even the Mac II running A/UX, Apple's version of Unix, has persuaded GE's engineers to give the Mac II a try. This is despite Apple's efforts to promote the Motorola, Inc. 68020-based Mac II as an engineering workstation and the proliferation of Macs within GE, which is buying more Macs than anyone right now. Ehrlich says the problem with the Apple approach is A/UX, which lacks the speed of the Sun operating system, its base of applications software and support for the X Window System environment. Part of the resistance to PCs stems from elitist engineers who want the fastest, most powerful machines available. However, their bosses are recognizing the economy of a PC vs. a traditional engineering workstation. An entry-level engineering workstation from Sun _ considered the leading vendor _ or its nearest competitor, Apollo Computer, Inc., costs about $20,000. An Intel Corp. 80386-based machine from Compaq Computer Corp., considered the leader in 386 technology, sells for about half that price. As a result, industry watchers say that more PCs are being purchased as engineering tools. The 386 microprocessor allows PCs to offer a more powerful hardware platform for Unix. However, the 386 still lacks the speed of a 68020. While a 386 running Unix is not suitable for complex applications such as molecular modeling, it is suitable for two-dimensional design work. 
``An 80386 running Unix is absolutely a workstation,'' insists Dave Burdick, a vice-president at Dataquest, Inc., a San Jose, Calif., market research firm. ``If they can use the disk as a rotating memory device, then they can run large engineering programs. You don't need to spend $50,000 to do CAD,'' he says, referring to computer-aided design. Burdick's colleague Bill Lempesis, a PC industry analyst at Dataquest, says more PCs bundled with Xenix, Microsoft Corp.'s version of Unix, are being sold than ever before. Not there yet Both Burdick and Lempesis say 386s, like the Mac, have a long way to go before becoming popular with engineers. ``The 386 running Unix is still an immature environment,'' Burdick says. ``There isn't very much application software written.'' Lempesis adds, ``PC vendors haven't positioned their machines as engineering workstations.'' One 386 that has been positioned as an engineering workstation is the Sun 386i, introduced earlier this year. The 386i runs Microsoft MS-DOS and Unix. Sun has not released shipment figures on the new system. In addition to prejudice, PC vendors must overcome some physical obstacles, Burdick says. Large corporations have years' worth of data stored in minicomputers and mainframes that must be accessed to perform engineering tasks. Workstations provide the better link to mainframe applications, Burdick maintains. ``The large companies are tied into their workstations,'' he says. ``They've built up years of data that they aren't willing to give up. PCs just don't offer the kinds of connectivity and compatibility to these machines.'' By Julie Pitta; Pitta is Computerworld's West Coast senior correspondent. <<<>>> Title : CASE fights to beat `all Author : Nell Margolis Source : CW Comm FileName: casemore Date : Jan 9, 1989 Text: ``CASE, at this point, is a lot like teenage sex,'' says an enterprising CASE consultant. 
``Everybody's talking about it; few are actually doing it; and even fewer are doing it right.'' While some might argue with his analogy, few involved in the implementation of computer-aided software engineering (CASE) _ whether marketing tools, consulting with would-be users or attempting to join the ranks of CASE users _ would debate his point. A study, among the first of its kind, conducted by Bellevue, Wash.-based consulting and market research firm CASE Research, surveyed commercial, mainframe-oriented MIS shops at large companies across a broad spectrum of industry sectors with regard to CASE implementation and intention. The relative dearth of active CASE work, coupled with a low level of what the report called ``CASE preparedness,'' led the report to conclude that ``the market as a whole is in a `pre-CASE' environment.'' CASE Research's findings were confirmed last month by Computerworld (see story page 50). What factors are retarding the implementation of one of the hottest technologies to flood media pages and market shelves in the past several years? And how likely are these barriers to be overcome in the coming year? The short answers appear to be ``many obstacles'' and ``very likely _ at least to some extent.'' What is CASE, anyway? One of the first (and worst) problems facing CASE, says CASE Research Chairman Vaughan Merlyn, is that the very attention the technology garners is confusing users, not only about which tools do what for whom, but also as to what CASE is, or is trying to be. ``There aren't very many people who can explain the promise of CASE tools in two or three sentences,'' admits Anthony I. Wasserman, chief executive officer of San Francisco-based tool vendor Interactive Development Environments (IDE). A major reason, he says, is that there is a plethora of alluring products, but there is no unifying or simplifying concept for the technology. ``There's a tremendous hype/reality gap,'' Merlyn says. This has two insidious effects. 
First, hype has inflated expectations, which lead to early frustration when individual CASE tools or approaches do not turn out to be overnight cures for whatever problem the user hoped to abate. Second, Merlyn explains, the hype deluge is leading users to aim CASE at the wrong problems. ``The hype is about productivity,'' he says, ``but the real issue is quality. CASE teaches you how to do things better _ not how to do the wrong things faster.'' The result of this confusion is predictable. ``Many firms,'' IDE's Wasserman says, ``have taken a very short-term view of what's out there. They rush to get a tool to evaluate on a specific project without assessing how either the tool or the project relates to overall software development needs, methods and goals.'' Education is the apparent cure for this problem, and, not surprisingly, explaining CASE is rapidly becoming a market in itself. ``When companies bring in CASE out of excitement without any organizational preparation, the tools don't fit into the corporate culture and usually become shelfware,'' says Dave Sharon, president of CASE Associates, a consulting firm based in West Linn, Ore. Consultants are taking advantage of a major income source by providing CASE services including auditing companies for CASE preparedness, Sharon says. Self-education _ becoming a canny consumer _ is also a good idea, claims CASE consultant and newsletter editor Gene Forte. For instance, ``Look for experienced vendors. Ask: Does this vendor have a users group? Who is invited to attend its meetings? What do they speak on?'' Also on the increase, according to many users and would-be users, is education at the peer level. ``This user-group thing has really taken off,'' says Kurt Wagner, manager at Arthur Young in Seattle, upon returning from the first nationwide organizational meeting of the CASE User's Group, an association spearheaded by CASE Research. 
The seedling association started with one regional chapter when the conference began early last month; by the time the attendees departed, nine regional chapters had been formed. ``There's tremendous interest in this,'' Wagner says. ``Until now, there's been no monthly forum with a working-session feel to it, where CASE users could exchange information and experiences. Almost everything you read about CASE is put out by a vendor.'' CASE of resistance ``CASE is a tremendously hard sell,'' says Judith Martin, vice-president of the First National Bank of Chicago. One of the major factors making it so, Martin explains, is the Pandora's box nature of CASE. ``At every step, as you implement methodologies, new questions and problems appear _ you need more understanding, more information, more management,'' she says. For starters, Merlyn says, ``Programmers feel threatened by CASE.'' They are ill-educated to feel otherwise, he states. ``Throwing CASE at programmers is like saying to a nurse, hey, we've got this great computer-aided brain surgery software that we want you to start using tomorrow.'' Adds Merlyn, ``Just because you have the tools doesn't mean that you have the training that makes you able to use them.'' Even those who do not feel threatened are often unprepared to discipline themselves to the rigorous methodologies that underlie CASE. ``The mind-set of software developers is a major barrier to CASE,'' says Wayne Sanford, director of contract service systems development at Bath, Maine-based Bath Iron Works Corp. Software developers, he says, often ``don't take an engineering approach _ which is what CASE relies on. They take an artsy-craftsy approach, where everyone has his favorite bag of tricks. Many aren't skilled in formal methodologies; they have to pick them up by osmosis.'' Ironically, understanding does not always break down the resistance to CASE. 
To the contrary, as users realize the time commitment inherent in what amounts to a revolution in the way software creators think and plan, they often balk. The full transition to working CASE, says Lee Stevens, a Corporate Engineering and Technology engineer at Pitney Bowes, Inc. in Norwalk, Conn., ``isn't the two days that some would lead you to expect; it isn't the six months that some would imagine; it's more like two years. The decision to go with CASE should be made with the recognition that it's going to be around for a while.'' Unfortunately, ``IS organizations like to see a lot of value up front,'' says Woody Blaylock, marketing and sales manager at Owens-Corning Fiberglass Corp. in Toledo, Ohio. Performance Resources, Inc., a software design consulting firm in Falls Church, Va., recently surveyed 19 CASE user companies on the question ``What must systems/DP organizations do to successfully implement CASE tools?'' According to the ensuing research report, survey participants, who ranged across a broad spectrum of businesses, ``overwhelmingly reported that they had unrealistic expectations concerning what the tools could do for the organization. This is particularly true for the short term. In fact, there may be no measurable short-term benefits.'' However, the report says, most companies did conclude that CASE held promise of significant long-term gains. Problems still possible Even assuming that all the other hurdles are overcome, problems can arise when it comes to deciding who will manage a new CASE system. ``We went through all this in the database area some years ago,'' Merlyn reports. ``We gave database management to programmers: at best, we got administration; at worst, chaos. It took 16 years to come around to the independent function of database administrator.'' Consultants and, to a mounting extent, users themselves are evolving numerous ways to cut through the seemingly interminable problems that add up to CASE resistance. 
Simple awareness of the problems can take potential users a long way, particularly when it comes to the cultural and political aspects, says CASE Associates' Sharon. ``Be sure you understand your parents,'' he says. ``Know where you came from. If you don't respect the 20 to 30 years of evolution that went into your corporate culture, you're going to get nowhere with your attempts to implement CASE.'' CASE implementation ``will follow as long as there's training,'' says Pitney-Bowes' Stevens. Stevens' own company was involved in a combination approach _ hire some outside experts, but develop in-house training capability as well. The inside development, he says, is critical. ``If you depend 100% on outsiders, you're in trouble because when those people are gone, they're gone.'' Not to be overlooked, most CASE observers agree, is the simple fact that time is on CASE's side. CASE is in its infancy. The more that is known, written and especially discussed user-to-user about the CASE implementation experience, the more the right questions will be asked _ and answered. Says Arthur Young's Wagner, ``We're trying to do in a few years what it took 100 years for civil engineering to do. The whole CASE brouhaha might be the catalyst we need to get us to start acting like engineers and stop acting like an arts-and-crafts profession.'' ``Price is a big issue in CASE implementation,'' says Jim Stuart, data administrator at Puget Sound Power & Light Co., an investor-owned utility company based in Bellevue, Wash. ``Any MIS director is going to choke at seven figures for a tool to help programmers _ especially when you're talking about another half a million for a repository and $10,000 for each 386-type workstation.'' Already sold on CASE as a concept, Bath Iron Works made its initial tools decisions based on functionality and price, Sanford says. 
It opted for a package from New Haven, Conn.-based Cadware that met the Maine shipbuilder's needs at one-quarter to one-third the price of the current market leaders, he claims. ``You can't justify spending the kind of money the market leaders are looking for, for something unproven,'' he says. Not everyone, however, agrees that price is, or should be, a barrier to getting on with CASE. ``Price isn't in my list of the top five problems facing CASE, or in the top five in any survey I've ever seen,'' CASE Associates' Sharon says. Deceptive pricing For one thing, he points out, the lofty prices listed by the market leaders are highly deceptive. Aggressive discounts that apply in most seriously undertaken implementation situations make the prices a lot lower than they look. Sharon speaks scathingly of companies that set out to capture CASE market share based on drastically lowered prices; to emphasize the up-front cost, he says, is to lead with the wrong edge. ``For a large company, CASE implementation becomes an integration issue right at the outset,'' Stevens says. ``If tools don't have an open architecture, we won't even consider them.'' Tool integration must reach beyond data exchange, says Sharon. ``Even if tools exchange data, I still might want to fine-tune,'' he says. ``The question becomes, How open is your architecture?'' Possibly because the integration problem looms so large, so early in the game, solutions are already pouring forth from a variety of sources. Individual tool vendors are increasingly aware that an isolated CASE product is less likely to sell. Many CASE observers say they believe that true integration will be achieved only when standards are imported into CASE. ``No one vendor has the whole solution,'' says Merlyn, who considers lack of standards a definite entry on any list of the leading barriers to CASE implementation. ``And even if they do, it's the whole solution from their viewpoint, not from each individual user's. 
Integration is highly personal; it has to be one-on-one,'' he claims. Such a need, Merlyn adds, can be met only by standards for such standbys as repositories, human interfaces, data types and methodologies. Hurdles The barriers in the way of broad-based CASE use are many; the emerging means of surmounting such barriers, however, are even more numerous. It is a good thing that they are, for there is a widespread view that the CASE debate has shifted from ``whether or not'' to ``how and when.'' ``Fortunately or unfortunately, depending on your view, most of the major systems development technologies are rapidly changing and adopting a CASE perspective,'' CASE Research concluded in its 1988 Annual Survey. ``Also, information management technologies are no longer considered `back office' functions; they are of strategic importance to the competitive survival of businesses.'' These factors, CASE Research reports, lead to the recognition that ``CASE must be exploited now. You simply cannot afford to wait.'' By Nell Margolis; Margolis is a Computerworld senior writer. Computerworld Editorial Advisory Board Survey <<<>>> Title : How computers are changin Author : CW Staff Source : CW Comm FileName: blumen Date : Jan 9, 1989 Text: Amidst the economic uncertainty of 1989 and beyond, the top priority for management will be to increase the pace and quality of innovation and, therefore, the competitiveness and profitability of their enterprises. In company after company, CEOs and CFOs express the conviction that the key to achieving this objective is to exploit the use of information systems. These business leaders talk about their keen interest _ and at times their impatience and frustration _ first in understanding and then in optimizing the payback they are getting from investments in computer and communications systems. 
Consequently, the predominant trend in information systems in the future will be that business executives will take a much greater interest and personal involvement in making decisions about information systems. Moreover, CEOs and CFOs realize that the very concept of ``payback'' needs new definition in the world of the '90s. While the issues of whether a computer system is doing its job efficiently and cost-effectively will still be addressed, a more important issue will be whether a company's computer systems are being used for the best possible purposes. In this broader context, the cost of an opportunity lost because of a bad choice of computer systems must be measured in terms not only of lost profits, but of lost competitive position or market share. As a result of this more activist analysis and management involvement, some of our industry's oldest myths and assumptions will be challenged by the fresh candor of senior managers. They will ask such questions as: Is the best system the cheapest one or the one whose total productive lifespan is greater than a competing system? Why standardize on one proprietary operating system when open systems based on standards and hardware-independent applications give us so much more flexibility? Why is our application backlog always so heavy? And which vendor can help us solve the problem most effectively? Moreover, the rapid advancement of information technology will not only solve formerly unsolvable problems, but challenge management to find innovative, productive and profitable uses for these powerful systems. In 1989 and beyond, four technology areas, in particular, will account for most of the growth in the information systems industry: Mainframe systems will play revitalized, innovative roles in implementing a company's competitive strategy. 
Large, highly fault-tolerant systems will provide real-time transaction processing; but in addition, they will more and more be the ``super servers'' of vast distributed systems of PCs, workstations and microcomputers _ thus ensuring corporate database integrity and providing powerful network management services. Open systems based on Unix System V will provide a new level of cost-effective, vendor-independent computing and flexibility that will directly challenge management to automate for immediate productivity gains, as well as create applications to generate new revenue streams and profits. Software application design, generation and maintenance tools will be used to eliminate application backlogs. What is more, these sophisticated programming tools will increasingly make the difference among operating systems transparent to users by allowing them to transfer applications with ease from one system over to another. Finally, networks capable of moving multi-media information instantaneously around the world will challenge all parts of an organization to use information for real-time competitive advantage. The availability and power of these information systems will continue to give managers more tools and options for identifying and capitalizing on new opportunities. The contribution these systems can make will play a fundamental role in business strategy and in the way managers plan for greater competitiveness. <<<>>> Title : CASE jury still out Author : Nell Margolis Source : CW Comm FileName: caseside Date : Jan 9, 1989 Text: When one consultant definitively stated that the high cost of CASE tools ``isn't in my list of the top five problems facing CASE, or in the top five of any survey I've ever seen,'' he hadn't seen Computerworld's _ where it is No. 1. In a survey completed last month, 60.5% of the 154 top MIS executives who participated in a study conducted on behalf of CW targeted the expense factor as an entry barrier to CASE. 
In addition, 18.4% listed the expense of training as a reason for holding back on CASE implementation. Other than the warning signals sent out by the dollar sign, however, the survey results implied that the jury is still out when it comes to CASE. The benefits of CASE have yet to be demonstrated, 50.6% of the respondents said; 38.8% pointed to too many tools and too few standards as a disincentive toward trying CASE. Moreover, 54.4% strongly or fairly strongly agreed with the statement, ``Not enough information has been made available about the issues in the CASE market.'' In addition, while only 24% of the survey respondents said that their organizations were now using CASE tools, almost twice that number _ 47% _ indicated plans to try CASE within the next 12 months. NELL MARGOLIS <<<>>> Title : 1989 will be the year of Author : CW Staff Source : CW Comm FileName: amyfor Date : Jan 9, 1989 Text: Most industry observers would agree that DB2 has become the standard for mainframe database management systems. But like the 3-year-old emperor of China, DB2 may not yet be mature enough to live up to the responsibilities of the throne. There is irony in the fact that IBM's DB2 was accepted as a standard by much of the industry before it was able to handle the high-volume production applications that a mainframe DBMS should handle. In most cases, DB2 is still used for information center-type applications, not for the applications that run a business. 1988 will go down as a critical year for DB2, one in which many announcements were made laying the foundation and direction for DB2's future, promising that DB2 will grow to be a real production-class system. Users saw 30% to 35% performance improvements in 1988 as well as the announcement of much-needed functionality such as referential integrity and data sharing. In the coming year, IBM must start delivering on its promises. As production applications start coming on-line, DB2 will be put to the test. 
This will be the year that DB2 grows up and the irony is resolved. Inroads on IMS DB2 is just now beginning to be utilized for new production applications formerly reserved for IBM's IMS. ``We are now reaching a point where DB2 performance is acceptable to build business applications,'' says Bill Franks, group manager of technology at Frito-Lay, Inc. ``I still don't think of it as a true performance DBMS, but we're starting to feel more comfortable with it.'' Frito-Lay is now using DB2 for decision support in conjunction with IMS. Although the company will use IMS for years to come, Franks will consider developing new business applications under DB2. Another DB2 site considering production applications is Banc One Services Corp. in Columbus, Ohio. ``Performance improvements have made DB2 a whole lot more appropriate for production applications than in the past,'' says David Van Lear, president of Banc One Information Services. ``The number of transactions per second in the past prohibited many applications, so most DB2 applications were ad hoc rather than production environments.'' A recent survey of DB2 users conducted by International Data Corp., a market research firm in Framingham, Mass., found that the application mix is shifting from information center to production, with a surprising 47% of the respondents indicating that production applications were their primary applications. Many DB2 users are pursuing a dual database strategy, using IMS for production applications that require high volumes of transactions and DB2 for query- and information center-like applications. Ron Perlow, data manager at Manufacturers Hanover Corp.'s Corporate Systems division, says his organization, like many, is using DB2 as a query facility for IMS data. Each is well-suited to its own applications, he maintains, but DB2 needs more functionality, and IMS lacks a query facility. The dual database strategy means that the two databases end up being out of sync much of the time. 
Perlow has asked his developers for a strategic direction to go with either DB2 or IMS. IBM has made improvements to DB2 that make it a viable transaction-processing DBMS for the first time. However, many of the pieces in IBM's database strategy are just coming together. Roberto Montero, a database analyst at Chevron Information Technology, believes the lack of development tools in the past held back many applications. ``We would have had production applications four years ago if we had a delivery tool,'' he remarks. IBM's Cross Systems Product is widely regarded as inadequate, but in the past year, a vast array of tools made by third-party vendors has become available for DB2. Some contend that DB2 still needs critical functionality, such as a transaction manager, before it can be used for high-volume transaction processing. The problem, according to Gig Graham, an analyst at the Stamford, Conn.-based Gartner Group, Inc., is that new DBMS technology is being used with old transaction management technology, like IBM's CICS and IMS. Many believe IBM will have to improve CICS's transaction management capabilities or come out with a completely new transaction manager for DB2 to really be up to par. Steve Laino, manager of database administration at Depository Trust Co., believes a lack of understanding of how DB2 works has kept many users from implementing production applications. Depository Trust currently has 40 production applications running under DB2, representing 13 million to 15 million SQL commands a day. Laino's shop processes extensively through CICS, he says, and although CICS is resource-consumptive, he has not experienced any bottlenecks. Laino maintains that as long as the resource-control table that resides in CICS is constructed properly, CICS can manage transactions just fine. Performance is critical if DB2 is to become widely used for production applications. 
Performance and referential integrity are prerequisites to allow distributed environments in the future. Until these capabilities are established, real distributed processing is not realistic. In October, IBM announced the first stage of its plan to allow distributed data in DB2. Along with referential integrity, IBM announced the availability of a single-update, single-read capability between two DB2 systems and initial support for multiple-read, single-update capability among multiple DB2 systems. This is the first step for many users who envision distributed environments in the future. Distributed data management is still a ways off, and IBM is slowly putting its strategy in place. Different strokes Distributed processing means different things to different people. Most distributed processing today consists of simple transactions, commonly called remote requests. In this type of transaction, the user queries a remote database and receives a copy of the original data. This method is fine in many cases, but since the copy does not reflect changes made to the original database, the data is not up-to-date for long and the systems are quickly out of sync. The next phase of distributing access to data is what IBM has called the ``remote unit of work,'' wherein a collection of related SQL statements constitutes a unit of work. In a remote unit of work, a user can read and update a single, remote DB2 database within a unit of work. The transaction will go through only if all the statements have been successfully completed. If the unit of work is completed successfully, the transaction is committed; if not, the data is rolled back to its original state. While the application is executing, the data is protected by a locking mechanism. If multiple DB2 DBMSs need to be accessed, multiple units of work must be issued. This capability is currently available under IBM's VM/SP with SQL/DS. 
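The all-or-nothing behavior of a unit of work can be sketched in a few lines. The example below uses Python's standard sqlite3 module purely as a stand-in for a remote DBMS; the accounts table, the transfer helper and the balance check are illustrative assumptions, not DB2 or SQL/DS features:

```python
import sqlite3

# In-memory database standing in for a remote DBMS (an illustrative
# assumption; this is not DB2 or SQL/DS).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE accounts (name TEXT, balance INTEGER CHECK (balance >= 0))"
)
conn.execute("INSERT INTO accounts VALUES ('a', 100), ('b', 0)")
conn.commit()

def transfer(conn, amount):
    """Two related updates form one unit of work: the transaction is
    committed only if every statement succeeds; on any failure the
    data is rolled back to its original state."""
    try:
        conn.execute(
            "UPDATE accounts SET balance = balance - ? WHERE name = 'a'",
            (amount,),
        )
        conn.execute(
            "UPDATE accounts SET balance = balance + ? WHERE name = 'b'",
            (amount,),
        )
        conn.commit()      # all statements completed: commit
    except sqlite3.Error:
        conn.rollback()    # a statement failed: restore original state

transfer(conn, 40)    # succeeds and is committed
transfer(conn, 1000)  # would overdraw 'a', so the whole unit rolls back
print(dict(conn.execute("SELECT name, balance FROM accounts")))
```

After the failed second transfer, both balances are exactly as the first, committed unit of work left them; neither update from the failed unit survives.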
A remote-unit-of-work capability was announced for IBM's OS/2 Extended Edition a year ago; in October, it was announced for DB2. The next stage of distributed data is called, using IBM terminology, the ``distributed unit of work.'' The new functionality in this technology allows access to multiple relational DBMSs within a single transaction, or unit of work, rather than just one. This capability was aired in the October announcement of DB2 Version 2 Release 2. Under MVS/CICS or IMS/DC, a user will be able to read any number of local or remote DBMSs and write to the local DBMS. However, in a TSO or batch environment, a user could read many and write to a single local or remote DBMS. The next phase in distributing data is the distributed request. In this distributed nirvana, all transactions that can be done locally can also be performed in a distributed environment. This capability, however, is a long way off. Although distributed processing is being widely discussed currently, two things are clear: It will be several years before this technology is here, and users _ excluding a handful on the leading edge of distributed processing _ do not need it right away. The main impetus for distributed data access is to let end users have access to the information they need to make decisions and do their jobs _ regardless of where the data resides. This can be done by providing access to data, without the data itself being distributed. For instance, at Huntington National Bank in Columbus, Ohio, John Voss, systems development manager, envisions one database accessed by many branch locations, providing intelligence at the local level. Although not doing any distributed processing today, he sees the customer information applications coming on-line in the 1989-90 time frame. Voss says whether Huntington needs true distributed processing still is not clear. 
``It sounds wonderful, but you have to look at the costs associated with it,'' he says, adding that new data administration and security problems are created. Nineteen eighty-nine will see the implementation of IBM's announced directions for DB2 _ including referential integrity and data sharing between DB2 databases. In addition, users will see further performance improvements that take advantage of the vendor's Enterprise Systems Architecture. IBM is also expected to unveil the pieces of its long-awaited repository. Many users today are building systems in anticipation of these capabilities being available when their applications are ready. So this year will be one of waiting for many users who are anxiously hoping IBM will deliver on its DB2 promises. By Amy Cortese; Cortese is a Computerworld senior writer. <<<>>> Title : Thousands of dollars are Author : Mark Breibart Source : CW Comm FileName: 4casting Date : Jan 9, 1989 Text: Thousands of dollars are spent by MIS managers each year on consultants' advice. Yet there is no way to measure the value of this service, and user needs or business trends are often more influential in determining a company's technology direction than an expert's recommendation. Nevertheless, managers continue to use consultants to gain additional insight, and the relationship between consultant and manager, like religion, is a matter of faith. ``These forecasts help stretch our thinking, but you have to look at your own company and its needs,'' says John Callahan, director of information resources management at Hershey Foods Corp. in Hershey, Pa. ``You have to have your own agenda.'' MIS managers keep the consultants' advice firmly in its place _ as raw data for their own evaluations or as support for a case they are trying to make. When it comes to crunch time, there is no doubt whose word wins. 
``If there's a discrepancy between what I think and what I read, we look harder,'' says Mark Schmidt, director of technology at Wal-Mart Stores, Inc. ``But ultimately, I have to trust my own senses.'' Some executives are skeptical about forecasters. John Langenbahn, vice-president of information resources at Dayton, Ohio-based Mead Corp., says he takes it all with a grain of salt. ``They've been projecting things for 10 to 15 years, much of which never happens,'' Langenbahn claims. He says he favors the pragmatic approach of waiting to see what really develops. But such strenuous doubters are in the minority. Most managers are hungry for all the insight they can get. All industry analysts are not created equal, however, according to discussions with some 20 information executives. Most often mentioned as being useful were those, like Gartner Group, Inc. in Stamford, Conn., that emphasize the technologies and their implications for MIS planners. Others that managers often turn to include Index Group, Inc. in Cambridge, Mass., or the Big Eight accounting firms, which focus on top-level strategies and training. Balloon busters Cautious before committing themselves to a new technology, especially one surrounded by the balloons and banners of the promoters, many managers call on their consultants to puncture the hype. When artificial intelligence was first gaining notoriety, Callahan found it intriguing but was not sure how well it fit in with his ``rather parochial'' industry, which tends to shy away from experimental technologies. He checked out the AI vendor himself, then sought a second opinion from the computer consultants at Arthur Andersen & Co. in Chicago, Hershey's accounting firm. He wanted to know ``whether this was a visionary approach that we should get involved in or whether we should use the Missouri approach to see if AI would stand the test of time.'' Arthur Andersen recommended that he wait. 
He did, and he is glad he held off, but future evaluations of AI are part of his five-year plan. MIS executives often need to check out the vendors as well as the technologies. Before committing large resources, managers want to make sure the supplier will be around for a few years and not follow other high-tech shooting stars into oblivion. Yet, this is often an analysis they need help with. ``We are not interested in having a staff large enough to do all those things ourselves, though we do have our own in-house experts,'' says Hans Huppertz, director of information systems at Dow Chemical Co. in Midland, Mich. When asked by Dow's agricultural research division about putting Oracle Corp.'s relational database on its IBM mainframe, Huppertz made one of his frequent calls to the Gartner Group for an evaluation of the future health of the Belmont, Calif., firm. ``It's like buying a little insurance for our own opinions,'' he says. When the Gartner Group came back with a positive report, the division got the go-ahead. The benefits from new hardware technologies are often more obvious _ or at least easier to judge _ than those of new software. But for planners, the timing of future purchases can be everything. For example, a Gartner Group forecast that erasable optical disks were coming down the road in a year or two changed the plans of Walter Perkowski, vice-president of computer operations at Republic National Bank of N.Y. He decided to put off buying more current-model hard disks and see if the optical devices really met Gartner's predictions. He's still waiting and will wait for ``another six months or so,'' he says. At McGraw-Hill, Inc. in New York, Richard Shriver does all he can to be prepared for new product announcements. He uses the forecasts of outside subscription services and of in-house experts, and he talks ``to people who profess to know as much as anyone outside the proprietary halls'' of the vendors. 
But since the unexpected is always possible, he hedges his bets. ``We go to great lengths to defer major mainframe purchases whenever we can,'' says Shriver, McGraw-Hill's senior vice-president and chief technologist, just in case an unanticipated product announcement is lurking around the corner. A well-timed delay of even a month could save the company up to $500,000, he adds. Shaping thoughts Beyond products and beyond trends, the research groups help shape the unconscious thinking of information executives in ways that cannot be pinned down completely. ``It's the kind of thing where you get an `Ah ha!' when you read it, but two weeks later you don't remember where you got the idea from,'' says Joseph Vincent, director of technical planning at Louisville, Ky.-based Humana, Inc. What is quite clear, however, is that no matter what analysts say is happening in the industry overall, managers decide on their direction based on their own applications and internal needs. Langenbahn, for example, who runs a large, multilevel IBM Systems Network Architecture network with mainframes, minis and micros, perked up at the notion expressed by some analysts that one of those three levels can be eliminated. Going to a two-tier setup would save him a lot of money, but, he says, being able to do it really depends on the needs of Mead's specific applications, not on what is going on elsewhere. Others echo that sentiment. When Gencorp, Inc. reorganized in the wake of a hostile takeover attempt, Linda George, the company's director of information services, needed to rethink her computer architecture. After carefully evaluating the Akron, Ohio, company's applications portfolio, she switched from IBM mainframes to ``lots of PCs'' and minis made by Wang Laboratories, Inc. There is, however, no clean demarcation between information managers' plans for their company and consultants' views of where the industry is going. 
It is more like a continuous feedback loop or barbershop mirrors, whose opposing images bounce endlessly back and forth. Portia Isaacson Wright, president of Future Think, Inc., points out that consultants' conclusions are developed for their subscribers only after speaking with countless users and vendors. When a consulting group asks 200 users for their opinions, for instance, ``there could be 200 variations on a theme.'' That would be chaos for the managers. By ``crystallizing the diverse opinions,'' she says, the analysts ``smooth the whole process of decision making. ``The beauty of it all,'' she adds, is that each user then gets a report that is really a distillation of the views of many users. And round and round it goes. <<<>>> Title : No managers need apply at Author : CW Staff Source : CW Comm FileName: 2manage Date : Jan 16, 1989 Text: ST. LOUIS _ When the first self-managed work group within McDonnell Douglas Corp.'s information systems organization was formed, it took the group 15 minutes just to decide when its first meeting should be held. ``Tuesday is bad for me.'' ``I have a dentist appointment Wednesday morning.'' It went on and on because there was no manager who could simply say, ``Meeting at 10 a.m. Tuesday. Be there.'' It isn't easy to get started in self-managed work groups, in which employees are their own bosses. But the vision of Wendell O. Jones, McDonnell Douglas' director of information resource management, is that self-managed groups will eventually encompass all of the functional areas of his 220-member corporate IS organization and change the way it is managed. The concept is not new to the company, although this is the first time it has been tried in IS. According to Jones, McDonnell Douglas was using the approach in its manufacturing operations. In fact, Jones said he sees the manufacturing experience as justification for such a change: ``There is a high level of employee commitment. . . . 
People feel empowered to be responsible for the quality of the process and product rather than having bosses or supervisors tell them what to do.'' But the self-management concept was not mandated for IS, and Jones said he was initially unaware of his company's experience in that area until he began considering it himself. ``I found out that what I am doing is being encouraged throughout my organization,'' he said. Jones' objective is to expand the concept's use as rapidly as possible, although there are situations _ such as one-person projects and geographically separated teams _ in which it will not be feasible. But although he conceded that some managers may not be ready for such a change, he added that ``most of my programmer and analyst teams in the MIS organization are functioning in [team efforts], and even those who don't function that way will understand that the overall culture of the organization encourages people to come up with ideas and encourages them to work as a team.'' ``One change I've noticed is that there is not a reluctance on the part of employees to take on a project'' or share an idea with the group, Jones said. Cooperation of all employees is needed for success, said Mike Bussey, a coach facilitator for the premier self-managed group. ``Some people say, `All I want to do is my job; I don't want to manage myself or anybody else.' Others are gung ho about it; they think this is an opportunity to change the system.'' Chuck Gibson, senior vice-president at Cambridge, Mass.-based consulting firm Index Group, Inc., said participants must be willing to put in the kinds of hours managers might more typically work. ``There is a lot of emotional involvement and peer pressure in getting things done,'' he said. Dual roles To the upper echelon McDonnell Douglas managers, Bussey and his co-worker Betty Houchins are first-line managers of the 25-member Master Savings System (MSS) area. 
However, within the MSS, they are now considered members holding a coach facilitator role. The MSS maintains the corporate savings system activities, such as profit sharing. The group's mandate is to study its own functional area to find where and how it can be improved. ``The whole team is going to receive recognition for the work it achieves _ not just me, not just Betty, but the whole team,'' Bussey said. The facilitators estimated that the group has spent close to a man-year in hours on the project since August. ``If you approach this with the idea of gaining immediate results, you're going to cause yourself frustration,'' Houchins said, adding that it could take up to two years to fully implement the plan as she envisions it. Bussey said he thought the MSS area was a logical department in which to kick off the program because its employees had always worked as a team; he soon realized that his sense of team spirit was a bit naive. ``There was always this silo effect of `We're in charge and we'll let you know what has to be done,' '' Bussey said. The team spirit has been improving with time, however, Bussey said. A recent request by the auditing department, he said, is proof; six people volunteered to do the work. ``Natural leaders emerge who will volunteer to do all of the coordinating. The team solidifies by itself,'' he said. In some cases, a group decision will not be as quick as a manager's, Bussey added, ``but it may be a better decision.'' The work groups planned for McDonnell Douglas are based in part on theories found in The Tom Peters Group's ``Leadership Alliance'' videotape program as well as the Japanese theory of Kaizen (which means continuous improvement). But theories alone do not make the idea work; what does make it work is support from supervisory personnel who must be willing to give up certain levels of responsibility and authority to the group, Jones said.
While he added that he has heard no resistance to the plan, Jones said he suspects the transition to the work groups will be easier for younger managers than for those who are solidly set in their ways. At first, many of the participants in the work group had trouble adjusting to making their own decisions, Bussey said. ``I have to turn to them and say, `Whose responsibility is it to make the decision?' It is up to me as coach facilitator to try to get people to know that it is not always up to me to make the decisions in the self-managed team,'' he said. Team spirit Once the group had agreed to try the management concept, it started to meet as a team. ``The first three meetings were extremely rough,'' Bussey said. ``We first had to define what is a team and what is self-management.'' The group enlisted the help of a human-relations facilitator who was familiar with the self-managed-team process. Next, the group drafted a list of 21 issues it thought needed to be addressed within the MSS and the total IS organization. Topics being addressed now include improving work flow, project management, group structure and procedures. One task the project management team studied was status reporting; it discovered this area cost the MSS department $192,000 annually. ``We figured that that was a little too much. They're going to look into that further,'' Bussey said. Still, Bussey said, the team has set some lofty goals, and things are moving at a slower pace than anticipated. There were many lessons to be learned, including teaching management techniques such as how to hold a proper meeting, handle disruptive behavior and manage conflict, Bussey said. So far, Houchins said, the concept has received acceptance from those involved.
``I've talked to people and get the impression from everyone that if management decides they don't want to pursue it further, we'll decide how to continue it within our own [MSS] boundaries; they won't want to give up the ability to manage themselves.'' Taken to its ultimate conclusion, the self-managed work group concept calls for the groups to also select their own leaders, something that is not likely to take place at McDonnell Douglas any time soon, Jones said. By Alan J. Ryan, CW staff <<<>>> Title : AT&T puts Unix at arm's l Author : CW Staff Source : CW Comm FileName: drooling Date : Jan 16, 1989 Text: AT&T last week took steps to distance its controversial Unix software business from its hardware strategies with the official launch of a new software division, dubbed the Unix Software Operation. Lawrence Dooling, formerly vice-president of marketing and sales support at AT&T's Data Systems Group, was named president of the new business unit and will report to Robert Kavner, president of the Data Systems Group. The new business unit will consolidate Unix operations, including Pacific and European Unix divisions, and will be responsible for the development, marketing and licensing of Unix System V software. In an interview last week, Dooling said his responsibilities will be to maximize the growth of Unix software and revenue generated from it. He said his priority is to be responsive to customers _ Unix International and others _ and to address some of the practices for which AT&T has come under fire. Kavner first talked about spinning off AT&T's software operations last May, only weeks after taking the reins of the Data Systems Group. As a supplier of both hardware and operating software that was licensed to other computer manufacturers, AT&T was heavily criticized for its handling and licensing of the Unix operating system.
Has to be different Dooling said that for Unix System V to take off as a widely used operating system, ``Unix licensees have to be assured that it is a separate product available to meet a separate need, distinct from the Data Systems Group.'' Unix International, the advisory group of Unix System V licensees, was recently formed to provide input to AT&T on the development and licensing of System V. Dooling said AT&T has been meeting continuously with the group and has started the formal process. The separate software unit is also likely to minimize protests and appeals against AT&T government contract awards, since the Unix software and support will be identically available to all licensees, Dooling said. The Unix Software Operation will include responsibility for System V's graphical user interface, Open Look. The product, jointly developed by AT&T and Sun Microsystems, Inc., was submitted for consideration as the interface for the Open Software Foundation (OSF). However, the OSF's recently announced selection included not Open Look but the Hewlett-Packard Co./Microsoft Corp. Common X Interface combined with Digital Equipment Corp.'s Decwindows. Dooling said AT&T had made a professional submission in terms of technical, business and legal aspects but that he was not surprised it was not chosen. He also said that AT&T will continue to market Open Look, due this quarter, but would attempt to provide some compatibility with the OSF interface. However, if a standards body such as X/Open Consortium Ltd. or Posix were to specify an interface as a policy, AT&T would support it along with Open Look, he said. By Amy Cortese, CW staff <<<>>> Title : SEC awards $52M electroni Author : CW Staff Source : CW Comm FileName: edgar Date : Jan 16, 1989 Text: WASHINGTON, D.C. _ The U.S. 
Securities and Exchange Commission (SEC) awarded a $52 million contract last week for the EDGAR electronic filing system, ushering in an era in which firms will send financial reports to the SEC in electronic rather than paper formats. EDGAR, which stands for electronic data gathering, analysis and retrieval, is expected to replace the more than one million pieces of paper the SEC gets from public companies each year. The SEC awarded its high-profile contract to BDM Corp., a McLean, Va.-based systems integrator, bidding with subcontractors Mead Data Central, Inc., Sorg, Inc. and Bechtel Information Systems. Many agencies are jumping on the electronic filing bandwagon, allowing companies to submit government-mandated reports, tariffs and taxes via magnetic media or data communications [CW, Sept. 5]. A 1988 government survey showed that 68 federal agencies have an operational electronic filing system, and seven more have pilot programs under way. In essence, EDGAR has three subsystems:

Data collection. EDGAR will enable companies (or their financial printers) to make required filings via direct transmission, disk or magnetic tape. Filers will begin using the system in several phases from mid-1990 to 1993, when all filers will be required to use EDGAR.

Data analysis. EDGAR will help SEC staff members to retrieve and analyze financial reports at workstations. Retrieval software will be developed by Mead Data Central, known for its Lexis and Nexis on-line databases.

Dissemination. EDGAR data can be viewed by the public and press at terminals located at SEC offices. In addition, Mead Data Central will sell bulk subscriptions to information vendors _ either via a real-time feed of EDGAR data or an overnight magnetic tape.

The SEC said its pilot EDGAR system, built by Arthur Andersen & Co. in 1984, has been successful and collected more than 45,000 electronic filings from 525 volunteer companies.
Along the way, however, the pilot system generated a lot of controversy and raised a host of new policy questions. Congress criticized the contract management; information brokers worried about unfair government competition; and bidders balked at the SEC's original financing plan [CW, March 25, 1985]. The contract award was a coup for Mead Data Central, which will distribute SEC information to the public and information resellers under government-regulated pricing terms. In addition, Mead Data Central said it will become one of the first customers of the regulated dissemination service so it can offer a commercial database of SEC filings to investment bankers, lawyers and financial services companies. In a statement released by Mead Data Central, Jack W. Simpson, president of the Dayton, Ohio-based company, called the SEC contract ``a model for the partnership between government agencies and commercial enterprises that will hasten the era of electronic information in the professional sector.'' By Mitch Betts, CW staff <<<>>> Title : Reality check Author : CW Staff Source : CW Comm FileName: edit19 Date : Jan 16, 1989 Text: THE COLD LIGHT of day is upon us. That is the way IBM Chairman and Chief Executive Officer John Akers summed up the reality check he and his lieutenants have undergone during the arduous and sometimes painful process of attempting to change some of the most fundamental business practices of the giant company. In a wide-ranging interview with Computerworld (see story page 14), Akers makes it clear that the world in which IBM must act and respond is far different from the world of just a few short years ago. Competition is measurably more fierce. Customers are decidedly more sophisticated and, therefore, much more demanding. Ultimately, the essence of success is not only keeping up with change but also anticipating it and recasting business strategies to exploit change.
But if you read carefully, you will find that this is not a gauntlet that only IBM must run: It is almost precisely the same treacherous terrain that information systems organizations are traversing right now as we head toward the 1990s. Akers specifically addresses and enumerates these challenges to IS, making the parallels between the IS challenge and the IBM challenge all the clearer. For starters, senior management in U.S. corporations is increasingly demanding accountability from IS. Remember the days when few even dared ask what was up in DP/MIS for fear of getting an answer they wouldn't understand anyway? Then there is tremendous pressure from the swelling ranks of end users and others involved in the IS planning and implementation process _ people who today have to be persuaded, coddled and indulged like real customers. Whatever happened to the old, impenetrable data processing bureaucracy? Finally, there is the IS manifest destiny that computer-based systems and networks hold the key to competitive advantage, forming an undeniable link between the performance of the enterprise and the fate of IS management. Wasn't it just yesterday that getting the payroll out on time and pumping out reports was considered a good week's work? If there is a single word that spells the difference between the information environment of yesterday and that of today and tomorrow, it is complexity _ both for the bellwether vendor and for IS organizations. Decisions made locally (at the business unit level at IBM, within the discrete IS department at other U.S. companies) today hold ramifications downstream throughout the entire organization. There are more mouths to feed, more hands to hold. Bullying is replaced by persuasion; audacity, by tact. But perhaps most of all, IBM and the IS community have learned the bitter lesson espoused by a magazine editor in the 1950s: Nothing recedes like success.
<<<>>> Title : A little care Author : Smoot-Carl Mitch Source : CW Comm FileName: carllet Date : Jan 16, 1989 Text: Regarding the recent articles about the Internet virus attack, one point that was not mentioned is that the fix to the Sendmail feature that allowed the initial assault was posted to Internet systems administrators more than two years ago. The fix to this particular problem has been well known for years in the Internet community. I might point out that this security hole has nothing to do with the inherent security or lack of security of the Unix operating system. With careful system administration, Unix can be made as secure as the proprietary systems of other vendors. Smoot Carl-Mitchell Managing Partner Texas Internet Consulting Austin, Texas <<<>>> Title : A few small things Author : P.E. Borkovitz Source : CW Comm FileName: borklet Date : Jan 16, 1989 Text: We at Via Information Systems Corp. congratulate Computerworld and Bob Moran on your article covering the release of Version 1.2 of our distributed database server, Via/DRE [CW, Nov. 14]. The article was both well-thought-out and well-articulated. We do wish, however, to point out one or two errors that, while small, could create major misunderstandings. First is the fact that for the 14 months it has been shipping, Via/DRE has had an implicit transaction control that automatically rolls back any single transaction that fails, for whatever reason. The feature that is being added is explicit transaction control that will roll back the entire transaction automatically if any single update fails. That feature is completing final beta testing and will be released within two to three weeks. Most importantly, we appreciate your attention to the real newsworthiness of a product that provides PC/LAN truly distributed database management today. P. E. Borkovitz President Via Information Systems Corp. Princeton, N. J. <<<>>> Title : No surprise Author : Augustino R. 
Luc Source : CW Comm FileName: lucentle Date : Jan 16, 1989 Text: After reading your two articles, ``Users lament Lotus delay'' and ``Add-ons simplify 1-2-3'' [CW, Oct. 10], I must say I'm not surprised that Lotus is two years late in coming out with its upgrade of 1-2-3. I believe that Lotus is two years behind because it has been trying to keep its market share by continually reassuring its customers that its product was near completion when it knew the whole time that completion was going to take longer. Another problem that Computerworld pointed out in ``Add-ons simplify 1-2-3'' is the other add-ons, such as the two that Marq Technologies and Personics Corp. have created, that interface with the Lotus program. By beating Lotus to the punch, companies such as Marq and Personics make the Lotus upgrade even less attractive to consumers who can easily purchase the add-on software. Lotus has proven that it cannot make timely adjustments to the ever-changing software market. It has lost respect as well as consumer patronage and now finds its program as well as its upgrade in jeopardy. As a senior in the Business Management Information Systems Department at the University of Idaho, I feel that Lotus has indeed lost credibility. How about you? Augustino R. Lucenti Moscow, Idaho <<<>>> Title : The time is now Author : Abby Pinard Source : CW Comm FileName: pinlet Date : Jan 16, 1989 Text: David Baer's article, ``Cooperative processing'' [CW, Nov. 14], raised many interesting points. However, his conclusion and bias that ``. . . its time has not yet come'' is incorrect. Most major corporations do have underutilized, powerful desktop CPUs. More and more, those personal computers are connected to LANs, minis or mainframes. And more often than not, their larger processors are bursting at the seams or soon will be.
Offloading many of an application's resource-intensive tasks _ such as user interface management and basic data validation _ to the PC while providing transparent access to a remote, shared data source such as DB2 not only can happen today but is happening today. Not all companies are ready for cooperative processing, and not all applications are appropriate for a cooperative solution, but many are. Moving ahead with cooperative technology is a very wise and cost-effective business decision. Abby Pinard Vice-President, Marketing Must Software International Norwalk, Conn. <<<>>> Title : Justifying your budget an Author : John Kirkley Source : CW Comm FileName: 1214kirk Date : Jan 16, 1989 Text: Just what do you do when you're sitting around the conference table at budget time and one of the senior managers _ the one who always taps his gold pen and pulls down the edges of his tight, thin mouth when he encounters a number over $500 _ says, ``I see MIS is looking for a 15% increase again this year. I'd like to know precisely what we're getting for our money.'' The game is called justifying your existence, and it is even harder to play in these unsettled and perilous corporate times. At these sessions, being able to show a correlation between information systems dollars spent and an improvement in the company's fortunes is most desirable. The game would not have to be played if MIS managers were cutting their budgets in 1989. However, budget trimming is not the trend, reports the Boston-based Index Group. The market research firm received responses from more than 10,000 senior MIS executives to a survey conducted in October. Nearly two-thirds of the respondents said they planned to increase their budgets by an average of 12.8% this year. Out of 20 concerns rated in the survey, cutting costs ranked No. 14 in the minds of these MIS managers. Yet many were aware that their bosses might not be taking as sanguine a view of IS expenditures. 
The Index Group's executive summary reported, ``While 79% of the IS executives disagreed that their organizations are spending too much on information systems, only 42% said their senior management would similarly disagree.'' Those who said their senior managers believe too much is spent on IS almost unanimously stated that management does not understand the real benefits from systems. ``They only see the expense side; there isn't enough communications of the value/benefits derived,'' said the IS chief of a chemicals company unit. ``Systems traditionally don't deliver the same kind of value that other departments do,'' chimed in the top IS professional at a consumer products company. Now, put yourself in the place of that cost-conscious senior manager mentioned at the beginning of this piece. There you are, tapping your pen on your IS manager's budget printout, listening to him tell you that systems traditionally do not deliver value as normally measured in other parts of the company. ``Just what . . . '' you might ask in a low voice tinged with a slight edge of menace, `` . . . just what do your systems deliver?'' Playing the field Deciding to do a little field work of my own, I called up several IS managers and asked how they went about justifying their budget. All had gone through the process many times during their long tenure. At one company, which was still recovering from the aftershocks and debt precipitated by an unsuccessful, unfriendly takeover attempt, the IS manager said that the current lean times had actually made justification very simple. ``We only do things that are absolutely essential to support the company's mission,'' he said. As a major textile manufacturer, part of that mission is to remain competitive in a global marketplace _ ``to beat the offshore competition,'' as he put it. To accomplish this goal, the company has made a substantial IS investment in electronic data interchange (EDI). 
``We're reducing the response time from the moment the consumer in the store requests a particular item to the delivery of the finished goods to the retailer,'' he continued. ``These days, in the clothing market, it's not so much quality or cost that makes the difference; it's availability and conforming to trends. This means fast turnaround.'' By using EDI, his company has formed a partnership with the other suppliers in the cycle, all the way from the producer of the fibers to the manufacturers of the finished goods and the retailers. Even before the goods leave his company's shipping dock, an EDI message is sent to the destination detailing the materials being shipped _ color, dye lot number and even the sequence in which the bolts of material were packed on the truck. ``We support strategic business objectives,'' the IS manager said. ``We help to give our company that critical competitive edge.'' Competitiveness was cited by several other managers. But, noted one IS professional at a large manufacturing concern, ``justification these days has become a lot harder. It used to be that you could install a payroll system and see an immediate, massive reduction in head count. ``Today, we're looking at `softer' justifications, and this is where top managers earn their money, making these tough decisions.'' Top management must be told, he said, that a new system might be just the tip of the spending iceberg. For example, an electronic mail system might have been initially installed for a simple, cost-justified reason; but users, being the intelligent, creative people they are, will quickly begin to find new applications for the system. Costs may soar. However, new and better ways of doing business should also emerge. War veteran One veteran of the corporate wars has devised a ranking of justification techniques based on degrees of subtlety.
First is the ``standard old situation'' _ justification based on return on investment, such as reducing head count, eliminating time delays, displacing old costs with new systems _ particularly in telecommunications, she notes. Next is a substantial face-lift of systems or adding new functionality. For example, in the banking industry, old and unreliable money-transfer systems need to be replaced. Here, you use the rubric ``modernization.'' Finally, and most difficult to justify, are systems introduced on the basis that they will pay off, but not in any bottom-line, immediately tangible way. Building better customer information files, installing a decision-support system and cleaning up databases all come under this category. Here, she says, is where decisions are made on intuition and gut feelings backed up by years of experience. In an ideal situation, the IS manager and top management would work together to devise information strategies. This joint planning would include a cooperative budgeting process. Justification would be a moot point. But in the real world, there will always be those managers looking at you down the barrel of their gold pens, asking you, in a tone of voice that suggests your answer will never be quite satisfactory, to justify your existence. By John Kirkley; Kirkley is a computer industry writer, editor and consultant based in Warwick, N.Y. <<<>>> Title : When the virus catches yo Author : Harvey Newquist Source : CW Comm FileName: virushar Date : Jan 16, 1989 Text: You're safe. The virile virus of 1988 passed you by. You assured both your CFO and COO that your data department is safe. As you lean back in your chair, legs crossed and arms folded behind your head, you even intimate that maybe _ just maybe _ the whole virus thing is a little exaggerated. It is 3:30 a.m. You are sleeping soundly, dreaming of _ a ringing phone shatters your sleep.
Wondering who besides a prankster would be making calls two hours before dawn, you mumble hello into the phone. It is your night shift supervisor, the man who oversees the input of transactions from the previous day. He is nervous. He thinks you better come in real quick, like now. His voice rises a little in pitch and volume. The main computer system has slowed down to a crawl. There's no available memory for any more input. Routines that normally take 30 seconds are averaging 15 minutes. All this has been going on since 2:00 a.m. He didn't think he should wait any longer to call. He has no idea what the problem is, but he senses it's going to get worse. At 4:05 a.m., you're standing in the middle of the data center, and no one is working. The night crew can't run any more jobs because all available memory is filled with nonsense data. Any attempt to get information out of the system just causes a whirring of disk drives for minutes before coming up blank. Blank! Your system is filled to the brim with data, but you can't seem to extract any of it, because something isn't letting it out. In your optimistic mood _ ``It can't happen here!'' _ you determine that there must be something terribly wrong with the system access. Probably just an innocuous little glitch, maybe part of a damaged operating system. At 5:30 a.m., you are sweating profusely, even though it is cold in here and colder outside. Your system has stopped. Immovable. Frozen. Dead. The computer claims that all disks and memory spaces are full but shows no data files. Every single record of every single transaction of every single department in the company is stored in those files _ accounting, payroll, personnel, sales, inventory and so on. What if those files are damaged? Most are regularly backed up, like accounting, but some . . . well, no use thinking about that yet. All you can think of now is your little remark to the boss about viruses and exaggeration. This is a virus if you've ever seen one.
But, of course, you've never seen one. How many people have? Visions of data files . . . At 6:30 a.m., with thoughts of data files vanishing into magnetic oblivion, you call your regular systems maintenance people at home and tell them to get to work, pronto. Your prized MIS department has sat idly for hours. The night shift crew hasn't done anything but stand around the perimeters of the room waiting to go back to work. Most of yesterday's transactions are still stacked up at terminals waiting to be input. You now have your three most valued systems employees poring over terminals, scratching their heads and calling out strange commands to one another, all the while feverishly hammering on their keyboards. In one hour, your regular crew gets in to work. You decide to send the night shift home. The night supervisor, looking like a shell-shocked junior lieutenant, says he'll stay to help wherever he can. And he wants to know what the hell happened and when. It's 9 a.m. Now your day crew is standing idly around the perimeter of the room, waiting to work. The maintenance guys say they've isolated the core of the virus, but they're not sure what to do about it. They say it's pretty small, maybe only about 80K bytes in size, but that it's like a black hole _ it has eaten all of the space around it. What about the regular files? Is the data intact? They're not sure. One says yes, the other says no, the third is undecided. They'll be able to tell in a while. You run your hand through your hair for the 30,000th time in an hour. Thunder-lined memo At 10:15 a.m., you get a memo from the office of the CEO. You're sure he is unaware of the situation, but his memo is ominous. It reminds you that the company is preparing to send its annual report to the printers, so the accounting and marketing departments need their printouts today to verify any last-minute changes. Wonderful. 
The head of the marketing department has been harping on you for two weeks to be ready today, because he's got the printer standing by, and he's on a very close timetable. A missed day could set the project back by as much as two weeks. Noon. Your entire staff, except for the three maintenance people, have gone to lunch. They have done no work _ just watched the work load pile up. Your department is now nine hours backlogged, the equivalent of one working day. You have turned down three report requests and informed the CEO, COO, CFO and the marketing and accounting departments that your computer system is inoperable, and the situation is grave. They are not pleased. At 1:27 p.m., your maintenance people let out a shriek of delight. They have excised the disease, like surgeons removing a tumor. All data files have retained their integrity; seemingly nothing was damaged. As they slap each other on their backs, the computer room goes back to work. There is no way that you're going to get marketing its printout today, and accounting needs its work first. There goes the schedule for the annual report. Trying to tell them it is not your fault is like Nixon telling the nation he knew nothing about Watergate. It's your department, so it's your fault. The three technicians start talking about the virus. ``Boy, that guy sure knew what he was doing. He got in on the phone lines between shifts when no one was paying attention . . . .'' ``Brilliant! His program was tiny, yet it grew to the size of Godzilla in about two hours! You gotta be a genius to be able to create something that monstrous . . . .'' While you listen to them extol the virtues of hacking, you calculate the costs of this genius' work. Two shifts of work where no work was produced. Nothing input, no output. Transactions for almost two full days are backlogged. A setback in printing the company's single most important yearly document.
Other departments are still waiting idly for needed reports in order to perform their corporate duties. They'll probably have to wait until Monday, since today is Friday, and nothing is getting out of your department today, except accounting's printout. You will need to bring in a shift during the weekend just to catch up on last night and today. There goes this month's data processing budget. Your boss _ correction, your bosses _ are angry. They want a full review of the system and all of its shortcomings, implying that your operation is less than sound. There goes more money for an outside auditor. As you continue to hear about the genius of the virus, your only thoughts are of revenge. Who would do this and why? Your talk of exaggeration is gone. This hacker invaded the sanctity of your domain, costing you untold hours of grief and the company untold money. You now have a sense of proportion. You wonder if anyone has ever taken out a contract on a hacker before. By Harvey P. Newquist III; Newquist writes and consults on artificial intelligence and other advanced high-technology topics from his office in Scottsdale, Ariz. <<<>>> Title : The changing of the fads Author : Stanley Gibson Source : CW Comm FileName: stancolu Date : Jan 16, 1989 Text: Whatever happened to the industry battle cry of a couple of years ago: ``We don't sell products, we sell solutions''? You don't hear it so much anymore. On the contrary, the wheel has come full circle. Now the rage is, ``We sell the best/fastest/most powerful products.'' Benchmarks, of course, are part of the current marketing craze. The idea of selling a soup-to-nuts package that was part of a partnership in which the performance of products was a marginal issue seems to have fallen by the wayside. In the world of today's generic products, you need the optimal generic product. Forget the intangibles. Are we in the middle of a fad, which, like many fads, is itself a counterfad to the previous fad? Probably.
But there is more to it than that. Users are unquestionably looking for reliable yardsticks to measure vendor claims. Users are looking closely at performance benchmarks in an effort to get the most for their computing dollar. Increasingly, users are being asked to play the role of sys ment was reached between Relational and DEC's Database Storage Group, rather than with DEC as a corporate totality. Another group within DEC could make an exclusive agreement of its own if it wanted to, I was told. Thus, many groups within DEC can make exclusive agreements, but none are exclusive from the point of view of the corporation. That makes sense: There are a lot of exclusive agreements, so none of them are exclusive. The impression I have from conversations with all parties is that the word exclusive meant a lot more to Relational than it did to DEC. Indeed, Cullinet might have approached DEC to strike the identical deal that Relational had but could not, because the wording of the DEC-Relational deal prohibited it. To that extent, the DEC-Relational deal certainly is exclusive. What DEC wants to accomplish is a difficult high-wire act. On the one hand, it wants as many strong agreements with software vendors as it can get: It wants a thousand flowers to bloom in its software field. On the other hand, it occasionally must offer something in return to these software firms. Hence, the kind of deal struck with Relational. Such deals, however, risk antagonizing the other vendors. Reportedly, upon hearing of the DEC-Relational agreement, a furious John Cullinane was on the phone to DEC, claiming that DEC had violated a policy against making exclusive agreements. But how angry can a software vendor get? It can stop writing software for DEC equipment, thus cutting off a stream of its own revenue. That wouldn't make much sense. The thousands of flowers blooming may resent each other, but they cannot afford to turn their back on the sun. Back to the TPC. 
In this space two weeks ago, it was noted that the Transaction Processing Performance Council, which calls itself the TPC, is charging its $5,000 annual membership fee as of Jan. 1, regardless of the dates on which various members joined in the past year, a fact that had rankled a member or two. Omri Serlin, the group's organizer and leader, responded, ``In general, I know of no industry consortium that is able to manage on such a low fee.'' He added that having a single dues anniversary for all members will remove questions of who is paid up when votes are taken. By Stanley Gibson; Gibson is Computerworld's senior editor, software. <<<>>> Title : Apollo says its supers ar Author : CW Staff Source : CW Comm FileName: bench Date : Jan 16, 1989 Text: CHELMSFORD, Mass. _ Apollo Computer, Inc. claimed that its Series 10000 Personal Supercomputer bettered the performance of a bevy of heavyweight competitors in a recent spate of performance challenges on both the East and West Coasts. However, the tests were conducted by Apollo _ not by an independent auditor. In an effort to buttress its claim that the Series 10000 brings the power of a supercomputer to the desk top, the workstation maker pitted a reduced instruction set computing-based Series 10000 against an Amdahl Corp. mainframe, Convex Computer Corp. and Alliant Computer Systems Corp. minisupercomputers, Digital Equipment Corp. minicomputers and Sun Microsystems, Inc. workstations. Fortran-based mathematical programs, isothermal flow applications, a designer's matrix multiplication and circuit simulation programs were all run during the November tests. The results from these tests were recently announced by the company. 
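Speed ratios like the ones Apollo reported come from timing a fixed kernel on each machine and comparing elapsed times. A minimal sketch of how a matrix-multiplication benchmark of this sort is typically scored (illustrative only; this is not Apollo's actual test suite, and the function names are ours):

```python
import time

def matmul(a, b):
    """Naive triple-loop matrix multiply, the kind of kernel a
    'designer's matrix multiplication' benchmark exercises."""
    n, m, p = len(a), len(b), len(b[0])
    c = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for k in range(m):
            aik = a[i][k]
            for j in range(p):
                c[i][j] += aik * b[k][j]
    return c

def mflops(n=100):
    """Time one n x n multiply and report millions of FLOPS."""
    a = [[1.0] * n for _ in range(n)]
    b = [[2.0] * n for _ in range(n)]
    t0 = time.perf_counter()
    matmul(a, b)
    elapsed = time.perf_counter() - t0
    # A naive n x n multiply performs 2 * n**3 floating-point operations.
    return (2 * n ** 3) / elapsed / 1e6

print(f"{mflops():.1f} MFLOPS")
```

Running the same kernel on two machines and dividing the elapsed times yields the "seven times faster" style of figure quoted below; the catch, as the article notes, is that whoever writes the kernel controls the comparison.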
Apollo said its Series 10000 performed more than seven times faster than a DEC VAX 8600 minicomputer running a designer's matrix multiplication, six times faster than a Sun-4/260 workstation running a Fortran benchmark and three times faster than a Convex C-1 minisupercomputer using a Fortran-based mathematical program. By James Daly, CW staff <<<>>> Title : OSF adopts composite inte Author : CW Staff Source : CW Comm FileName: osfteck1 Date : Jan 16, 1989 Text: Striving to please the greatest number of developers and users as quickly as possible, the Open Software Foundation (OSF) recently announced a hybrid graphical user interface for its planned version of the Unix operating system. The choice, as anticipated [CW, Dec. 19, 1988], draws on a tool kit from Digital Equipment Corp. with user interface characteristics jointly submitted by Hewlett-Packard Co. and Microsoft Corp. The work of DEC in recruiting independent software vendors to develop for the X User Interface (XUI), on which Decwindows is based, paid off in the selection process, according to participants. Richard Treadway, Decwindows program manager at DEC, said some 300 independents have been working with the development tools. Meanwhile, HP and Microsoft's submission, which includes the behavior of Microsoft's Presentation Manager, was selected because many personal computer users who use the Presentation Manager will feel comfortable with the interface. ``Our goal is to bring the PC users to Unix _ also to say we are going to co-exist in a world of Presentation Manager and Unix-based systems,'' said John Paul, OSF director of development. Part of the HP-Microsoft submission was a three-dimensional appearance feature, which can cause a box on the screen to look as if it is raised or depressed. ``If you look at the OSF offering in June, it will look like HP's New Wave, including the 3-D appearance. It will look like Presentation Manager with 3-D,'' Paul said. 
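The raised-or-depressed illusion Paul describes is conventionally drawn by lighting two edges of a box and darkening the other two: light top/left and dark bottom/right reads as raised, and swapping the shades makes the same box read as sunken. A toolkit-independent sketch of the idea (function and parameter names are ours, not OSF's):

```python
def bevel_colors(pressed, face=0x80, lighten=0x40):
    """Return (top_left, bottom_right) edge shades for a 3-D box.

    A raised box looks lit from the upper left: light top/left
    edges, dark bottom/right. A depressed (pressed) box swaps the
    two shades, so the identical geometry appears sunken.
    """
    light = min(0xff, face + lighten)
    dark = max(0x00, face - lighten)
    return (dark, light) if pressed else (light, dark)

# Raised state: light shade on top/left, dark on bottom/right.
print(bevel_colors(pressed=False))  # → (192, 64)
```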
The part of the graphical user interface that comes from DEC, the XUI-based development environment of Decwindows, will not be immediately apparent from the screen's appearance. Developers who have worked with DEC on its Decwindows program will notice no change. ``If you were an independent software vendor writing an application, you will be dealing with the same thing,'' Paul said. DEC is set to formally announce Decwindows tomorrow, although the firm actually began shipping Decwindows in December with Version 3.0 of its Ultrix 32 operating system. Development of Decwindows was complete six months ago, Paul indicated. ``DEC delivered us the most mature, best-performing tool kit,'' he said. Although the OSF could have chosen the totality of either DEC's or HP's submission, which would have been available right away, the group opted for a combined interface, which is an improvement over either of the separate offerings, according to Paul. He added that the hybrid interface should be completed and available in six months, a lead time he called acceptable. ``Timeliness is important. Our members said pick something you can deliver,'' Paul said. Paul said Open Look, the graphical user interface of AT&T (the OSF's archrival in the struggle to establish a standard Unix), was one of 23 user interface submissions that conform to the OSF's published mandatory guidelines. ``And their business terms were acceptable,'' Paul added. However, he declined to comment on why Open Look was not chosen. DEC and HP both claimed victory because parts of what they submitted were accepted. Both asserted that the choice will help establish Decwindows and New Wave as standards. ``HP is pleased because it gets a user interface that is consistent with that of New Wave, something that was needed to make New Wave the standard they wanted it to be,'' said Nick Fowler, marketing manager for user interfaces at HP. ``The hybrid is both technically superior and politically palatable,'' he added. 
``The engine and drive train are the same, although the decoration and dashboard are new,'' DEC's Treadway said. The only differences According to Treadway, there are only two differences between the Presentation Manager-type interface chosen and the Decwindows interface that was not chosen. Under Presentation Manager, a user may operate without a mouse, which is not possible with the Decwindows interface. In addition, the method of invoking menus with the mouse is slightly different. Paul said there are other differences as well, although the two mentioned above are the most significant. By Stanley Gibson, CW staff <<<>>> Title : BASF to peddle ¼-in. tape Author : CW Staff Source : CW Comm FileName: newbasf Date : Jan 16, 1989 Text: BASF Corp., which is a leading supplier of tape cartridges for IBM 3480 users, has announced plans to enter the U.S. ¼-in. tape market. The organization, which has been selling ¼-in. tapes in Europe for a year, said recently that it plans to market the tapes primarily to microcomputer users, although it will offer the tape cartridges to mid-range system users as well. The tape cartridges are compatible with the ¼-in. products supplied by 3M Corp., BASF said. According to BASF, ¼-in. tapes are increasingly used in microcomputer-based environments, particularly networked workstation setups, for back-up purposes. The cartridges, which are guaranteed for 5,000 passes, will be offered in five configurations, including the 5¼-in. size and the 3½-in. mini cartridge size. Those two sizes will be offered in different tape lengths and recording densities. The tapes will be sold primarily through BASF resellers. Retail prices will range between $25 and $35 per cartridge, the company said. BASF said the tape cartridges are available now in limited quantities. A spokesman said the company expects to be shipping the products at full volume within three to six months. 
<<<>>> Title : Community Health Computin Author : CW Staff Source : CW Comm FileName: hwchc Date : Jan 16, 1989 Text: Community Health Computing, Inc. has unveiled an integrated radiology information system, called Radcare. The system reportedly encompasses several features, including information management, a financial system, diagnostic reporting (digital dictation and voice recognition), picture archiving and communications functions. According to the vendor, the product interfaces to the hospital information system and incorporates a fault-tolerant hardware platform. Radcare is available on a client-specified, modular basis. Pricing depends on specific configuration. Community Health Computing, Suite 2000, 5 Greenway Plaza, Houston, Texas 77046. 800-231-2604. <<<>>> Title : A disk processing system Author : CW Staff Source : CW Comm FileName: hwstorag Date : Jan 16, 1989 Text: A disk processing system specifically designed for use with super and minisupercomputers, imaging computers and high-speed data acquisition systems has been introduced by Storage Concepts, Inc. Concept 51 reportedly provides high-capacity storage via 5¼-in. disks and allows one disk-controlling unit to support up to 63 disk drives. The system is packaged in a single rack-mountable chassis that incorporates a controller unit, data buffer, system power and cooling and over 6G bytes of storage. The product supports several buses, including Digital Equipment Corp.'s Qbus and Unibus. Prices for the controller start at $17,900. Storage Concepts, 1622 Deere Ave., Irvine, Calif. 92714. 714-852-8511. <<<>>> Title : Perma Power Electronics, Author : CW Staff Source : CW Comm FileName: hwpermap Date : Jan 16, 1989 Text: Perma Power Electronics, Inc. has announced a 1200VA version of its standby power system for micro and minicomputers. 
According to the company, the SPS-1200 protects against data loss and equipment damage resulting from blackouts, sags and brownouts and automatically protects against overload without the use of special load meters. The SPS-1200 costs $1,299. Perma Power, 5601 W. Howard Ave., Chicago, Ill. 60648. 312-647-9414. <<<>>> Title : SDI Co. has announced an Author : CW Staff Source : CW Comm FileName: swsdi Date : Jan 16, 1989 Text: SDI Co. has announced an IBM VM product designed to improve system performance by caching the VSE lockfile in processor storage for immediate access. According to the company, Cache Magic XA/LF eliminates the contention created by multiple VSE guests requiring constant access. The product carries a license fee of $8,500, and trial systems are available from the vendor. SDI, 1700 S. El Camino Real, San Mateo, Calif. 94402. 415-572-1200. <<<>>> Title : Software designed to addr Author : CW Staff Source : CW Comm FileName: swrelati Date : Jan 16, 1989 Text: Software designed to address batch problems for IBM DB2 users has been announced by Relational Architects, Inc. Called DB2/Batch, the product reportedly can run DB2 programs directly in batch without the use of conventional IBM JCL and allows batch DB2 programs to work compatibly with job-scheduling packages and multistep job streams. DB2/Batch costs $4,500 for a perpetual license or it can be rented for $270 per month. Relational Architects, Suite 2341, 175 5th Ave., New York, N.Y. 10010. 212-966-0010. <<<>>> Title : AT&T and Intercon Systems Author : CW Staff Source : CW Comm FileName: swatandt Date : Jan 16, 1989 Text: AT&T and Intercon Systems Corp. have introduced a package designed to add multilevel security features to database management systems. Called the Trudata Model 3BBL, the product is said to be compatible with AT&T Unix System V/MLS secure software and the Unix operating system. 
It incorporates audit trail facilities as well as tracking by terminal access, date, time and type of access, according to the vendors. Reportedly developed for use with any DBMS that has an ANSI-standard SQL interface, Trudata Model 3BBL is priced from $20,000 to $40,000, depending on quantity ordered and system configuration. AT&T, National Product Center, 1 Speedwell Ave., Morristown, N.J. 07960. 800-247-1212. <<<>>> Title : SQL not Dbase forte, test Author : CW Staff Source : CW Comm FileName: dbug Date : Jan 16, 1989 Text: TORRANCE, Calif. _ All along, critics argued that Ashton-Tate Corp. would not be able to produce an effective implementation of SQL in its Dbase IV product, charges the company vehemently denied. Now that Dbase IV is out, it seems the critics were right after all, at least regarding Ashton-Tate's first crack at SQL. According to Fred Luk, president of Quadbase Systems, Inc., who has tested Dbase IV, users may get incorrect results from even simple SQL queries. Luk tested Dbase IV against his own product, Dquery, which provides Dbase III Plus with SQL capabilities. Ashton-Tate has not denied his findings and pledges to correct Dbase IV flaws. With some queries, Dbase IV found an incorrect result. With others, it failed to run because of internal errors. At other times, Dbase IV's SQL gave inconsistent results depending on the method of querying. Fortunately, an Ashton-Tate spokesperson said, some fixes should already be available through the Compuserve on-line service. Ashton-Tate officials also said the SQL problems are very data dependent, only cropping up when the product is used in unusual and specific ways. The biggest problem for Ashton-Tate was its lack of SQL experience. Unlike many SQL developers with years of experience, Ashton-Tate had to build its complex SQL system nearly from scratch. 
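One way to catch the kind of inconsistency Luk describes is to run semantically equivalent formulations of a query against a single engine and check that the answers agree. A minimal sketch of that testing idea using Python's bundled sqlite3 as the engine (the table and data are invented for illustration; this is not Quadbase's actual test harness):

```python
import sqlite3

# Build a throwaway in-memory table to query against.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 10.0), (2, 25.0), (3, 40.0)])

# Two semantically equivalent ways of asking for orders over $20.
direct = conn.execute(
    "SELECT id FROM orders WHERE amount > 20 ORDER BY id").fetchall()
via_subquery = conn.execute(
    "SELECT id FROM (SELECT * FROM orders WHERE amount > 20) ORDER BY id"
).fetchall()

# A correct SQL implementation must agree with itself on both phrasings;
# results that vary with the method of querying indicate the class of
# bug the article describes.
assert direct == via_subquery == [(2,), (3,)]
print("consistent:", direct)
```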
Another problem for SQL programmers is the lack of support for nulls _ a way of dealing with missing data _ said Fabian Pascal, an independent relational database consultant based in Washington, D.C. Others have criticized the performance of the SQL. The central problem with the performance is that SQL was designed with so-called set processing in mind, where data is manipulated in entire tables. Dbase IV, however, maintains its record-at-a-time database engine. Translating between the two is both difficult and slow. SQL bugs accompany problems in other areas of the product, including numerous incompatibilities with Dbase III Plus and problems with the Run command [CW, Nov. 12, 1988]. No browsing Calculated fields (where data values are computed as entered) do not work properly either, said noted Dbase teacher Adam Green. Calculated fields work only with Browse and do not work with Edit or Append. Nor does the documentation match the product, Green argued. For example, the manual lists messages that the Range command is supposed to give. The Range command, however, fails to provide these messages. There is also little backward compatibility for Dbase IV users who take advantage of the new memo fields feature. Any Dbase IV database that uses memo fields cannot be read by Dbase III, Dbase III Plus or any Dbase compatible. Although Ashton-Tate never promised full backward compatibility, the new memo fields will present a major problem for firms with a mix of Dbase III and Dbase IV users. ``People are going to have to share data,'' Green said. ``There is not going to be an instant upgrade.'' By Douglas Barney, CW staff <<<>>> Title : Jaguar revs worksheets fo Author : CW Staff Source : CW Comm FileName: jaguar Date : Jan 16, 1989 Text: Next month, end users will get a first-hand look at Jaguar and King Jaguar, two new spreadsheet compilers from Sheng Labs in Beaverton, Ore. 
The two products are said to compile worksheets into executable files that are compatible with Lotus Development Corp.'s 1-2-3 and other spreadsheets, yet do not require that end users own the application program. Jaguar and King Jaguar support 1-2-3 Versions 1.0a, 2.0 and 2.01. The company said it will also support Release 3.0 when it becomes available. According to Sheng Labs, the spreadsheet compilers offer the advantages of speed, because the compiled application runs faster than interpreted programs, and security, because end users cannot alter the executable file. Jaguar, which retails for $395, features a symbolic debugger that supports standard, conditional and trace breakpoints; minimum recalculation technology, which speeds execution of an application; expanded memory support to enable end users to use worksheets too large for conventional memory; and support for Intel Corp. 8087 and 80287 math coprocessors. These features are also included in King Jaguar, which retails for $595. It features a user-defined function that enables developers to define in C language functions not available in 1-2-3 and virtual memory that expands automatically to a hard disk when the application exceeds random-access memory. Jaguar, King Jaguar and the compiled applications run on the IBM Personal Computer, XT, AT, Personal System/2 and compatibles with at least 384K bytes of RAM and require IBM PC-DOS or Microsoft Corp. MS-DOS 3.0 or higher. By Michael Alexander, CW staff <<<>>> Title : Unix winners and losers Author : William Brandel Source : CW Comm FileName: intelfac Date : Jan 16, 1989 Text: The Open Software Foundation's (OSF) graphical user interface selection isn't for workstation users only. A closer look at the interface's ramifications indicates that it will have a distinct effect on the personal computer industry as well. 
For those of you who were not glued to industry happenings over the holidays, the selected graphical user interface is a hybrid that includes DEC's X Window User Interface (XUI) as its application programming interface and HP's and Microsoft's Common X Interface (CXI) as its user interface (see story page 25). CXI is the look-and-feel portion of Presentation Manager/X (PM/X) and will essentially be a port of the Presentation Manager for OS/2. The announcement is a boon to HP and Microsoft because it legitimizes their PM/X efforts. But more important is the challenge the OSF poses to Presentation Manager and OS/2. Because of the Presentation Manager's current shortcomings and the viability of the PM/X and XUI hybrid, PC users may finally be given a sound reason to consider running Unix on their PCs. As already voiced by users, Presentation Manager is flawed. It lacks device drivers and, therefore, does not yet support many of the popular monitors and printers. It lacks any software applications support. It requires a ton of memory and is optimized for Intel 80286-based machines. In short, users must spend a lot of money for a system that really does not do anything. This means a slow ramp-up speed for Presentation Manager and affords a window of opportunity for the Unix community. War is still on After slowly gaining momentum and then being ravaged by consortium wars, one large faction in the Unix community has agreed on an interface based on the industry-standard X Window System. According to major DOS vendors, the OSF interface has similar memory requirements to Presentation Manager. But unlike Presentation Manager, the OSF selection is optimized for the machines it runs on. This gives the OSF interface a technical advantage and could possibly accelerate the Unix momentum in the PC market. Some may argue that a Unix-based interface and its technical specifications are not significant to the PC market today. 
But according to Tom Kucharvy, president of Boston-based Summit Strategies, who analyzes the high-end PC market, half of Intel processors being sold today can in fact run Unix. With a solid interface technical specification to write to, it is entirely feasible that solid Unix applications for the PC could make it to market in the same time frame as those written specifically for Presentation Manager. Accordingly, those applications developed for Presentation Manager can be converted to run PM/X without a total rewrite. These developments mean that Unix is now poised to make a run on the PC and also that users have an interface alternative that is superior to what is currently being offered to them. Meanwhile, Presentation Manager faces some serious challenges in the PC industry's near future. By William Brandel; Brandel is a Computerworld senior writer. <<<>>> Title : Practicing what they teac Author : CW Staff Source : CW Comm FileName: gainesvi Date : Jan 16, 1989 Text: Even math teachers have trouble practicing what they teach from time to time. But now, teachers at the University of Florida in Gainesville, Fla., are themselves using personal computers to learn how to turn kids on to numbers. The PCs run software called ESC Elementary and Middle School Mathematics Curriculum, which is marketed by Education Systems Corp. in San Diego, Calif. According to Mary Grace Kantowski, professor of mathematics education at the university, the program allows teachers to evaluate and practice the school's mathematics curriculum before introducing it into the classroom. The program is unusual, she said, because ``we're not teaching teachers about the computer; we're teaching them with the computer, so they can go back and teach their students with the computer.'' Math overhaul Kantowski, who is a 32-year teaching veteran, explained that the National Council of Teachers of Mathematics is developing new curricula and evaluation standards for grades one through 12. 
More emphasis is being placed on teaching prealgebra, geometry, probability and statistics. Kantowski added that many middle school teachers are not prepared to teach these subjects at that level. Middle school teachers are also more prone to suffer computerphobia and misconceptions about the technology, Kantowski added. If they are familiar with the personal computer, they tend to view it more as a word processor or number cruncher, not as a vehicle or medium for delivering instruction. She also pointed out that most teachers rely on textbooks _ not technology _ as an integral component of their instruction. Student teachers learn to teach with textbooks by designing lessons and practice teaching before they go on to actually teach a class of their own, she said. ``This program was designed to continue that experience but with software curriculum,'' Kantowski said. ``It shows teachers how to extend the curriculum on the computer into the classroom.'' She said she feels that one of the most valuable aspects of the program is that it gives teachers an opportunity to work extensively with material before they have to teach it. Learn by seeing Kantowski said the program's use of sound and colorful graphics is especially valuable to student teachers because it is ``much more effective to be able to rotate a cube in actual 2-D or 3-D.'' During the upcoming spring semester, Kantowski will instruct several area teachers two days per month on the system. The teachers will first learn the content and then construct material. Their goal is to learn how to present and integrate what they have learned into classroom activities. The ESC program comes in a compact-disk format and is currently running on a local-area network at the university. The LAN consists of IBM Personal System/2s and Tandy Corp. personal computers, each with 640K bytes of memory. 
By Sally Cusack, CW staff <<<>>> Title : Merging safely Author : CW Staff Source : CW Comm FileName: softips Date : Jan 16, 1989 Text: When merging an Ashton-Tate Corp. Multimate file with a Dbase file, make sure the field names in the merge document are typed exactly as they appear in the Dbase file. Multimate will not merge correctly if the field names differ, even if the only difference is capitalization. Information provided by Corporate Software, Inc., a Westwood, Mass.-based software reseller. <<<>>> Title : Ami melds graphics, word Author : CW Staff Source : CW Comm FileName: am Date : Jan 16, 1989 Text: End users say they can bridge the gap between word processing and desktop publishing software with Samna Corp.'s Ami, a graphics-based word processing package. Ami, which makes use of Microsoft Corp.'s Windows, enables users to perform graphics-oriented functions with a what-you-see-is-what-you-get type of environment. John Fischer, corporate office systems manager at Helix Technologies Corp. in Waltham, Mass., needed a product that could begin adding windowing to his corporate office systems. Fischer said he is using Ami to ease these migration efforts. ``We're not a big Windows house and are still trying to decide what areas we want to use Ami in,'' Fischer said. ``As [Ami] includes features of both word processing with some of the flavors of desktop publishing, we are relying on it while we're still in an in-between mode. There isn't another product like it on the market.'' Andy Lipscom, president of East Valley Graphics, Inc., located in Rydal, Ga., agreed. ``Unlike Microsoft's Write program, Ami performs real word processing under Windows,'' Lipscom said. ``Samna has done an excellent job of taking advantage of windowing to create a word processing product.'' Lipscom, whose company designs graphics-based software, is using the product to process documents. 
He believes that Ami can replace Xyquest, Inc.'s Xywrite as the document processing system in the personal computing environment. ``Xywrite suffers from its ad hoc status,'' Lipscom said. ``They started with a good editor and then added every conceivable feature available.'' Negotiating through Xywrite's array of features is the key problem. ``With Xywrite, you have to figure where the hell each of its added features are and then figure out how to use them. Ami was designed with ease of use in mind,'' he said. But, Lipscom added, Ami is not perfect. Lipscom cites the users' inability to change documents from Xywrite to Ami as the product's most obvious flaw. This obstacle stems from the fact that Ami lacks an import file box that will accept the old Xywrite documents. But Lipscom said that he is aware that Samna is working on correcting this problem. A Samna spokeswoman said that the interim version of Ami, which will precede the release of an Ami Professional offering, will enable the user to convert Xywrite files. The spokeswoman said that in the meantime, the product will accept Xywrite files in ASCII. The Ami product, priced at $149, allows the user to select from pull-down menus and command options and get a real perspective of the document before it is printed. The program also includes 25 different style sheets that can be used for layouts of memos, business letters, newsletters and reports. Ami will run on an IBM Personal Computer AT or Personal System/2 and compatibles with Intel Corp. 80286 or 80386 microprocessors equipped with 640K bytes of random-access memory. By William Brandel, CW staff <<<>>> Title : Wang Microsystems, a divi Author : CW Staff Source : CW Comm FileName: micwangm Date : Jan 16, 1989 Text: Wang Microsystems, a division of Wang Laboratories, Inc., has extended the high end of its PC 200/300 series with the introduction of two Professional Computer models. 
The 16-MHz PC 381 reportedly provides IBM Personal Computer AT compatibility and was designed to meet basic customer requirements in the business personal computer market. Pricing starts at $3,195. The PC 382, running at 20 MHz, is especially suited for graphics and communications applications, the vendor said, and is priced from $3,450. Each system features an eight-slot chassis and offers 1M to 4M bytes of base memory, with upgrades to 16M bytes. Both products are offered in a variety of configurations, including a broad range of memory, disk controller, fixed- and removable-disk storage and monitor support options. Wang Microsystems, 10 Technology Drive, Lowell, Mass. 01851. 800-962-4727. <<<>>> Title : Imagetech Corp. has acqui Author : CW Staff Source : CW Comm FileName: micimage Date : Jan 16, 1989 Text: Imagetech Corp. has acquired rights to the recently introduced Marvin Software personal computer-based imaging product. The software reportedly enables users of IBM Personal Computer ATs, Intel Corp. 80386 machines and compatible systems to scan text and graphics documents, as well as store, retrieve and work with the digitized images. The package is menu-driven and mouse-oriented and runs under Microsoft Corp.'s Windows and MS-DOS. It operates with Canon U.S.A., Inc. and Fujitsu America, Inc. desktop scanners. Marvin Software costs $695. Marvin Plus, which includes data compression and decompression logic functions, is priced at $895. Imagetech, 1864 Northwood Drive, Troy, Mich. 48084. 313-362-3141. <<<>>> Title : A mouse enhancement produ Author : CW Staff Source : CW Comm FileName: micgalac Date : Jan 16, 1989 Text: A mouse enhancement product for users of Xyquest, Inc.'s Xywrite III Plus package has been announced by Galactic Software, Inc. Called Xymouse 2.11, the software reportedly accelerates word processing tasks and integrates hard-disk inventory control functions. A hard disk is strongly recommended but not required. Xymouse 2.11 costs $149. 
Galactic, Suite 211, 2600 Philmont Ave., 1 Fairview Plaza, Huntingdon Valley, Pa. 19006. 215-947-0930. <<<>>> Title : All-American Software Dev Author : CW Staff Source : CW Comm FileName: micallam Date : Jan 16, 1989 Text: All-American Software Development Corp. has released an Apple Computer, Inc. Hypercard database management application. Tabularium, according to the organization, was designed as a tool for importing data from Apple's Macintosh, DOS and mainframe databases into Hypercard. The software imports to new stacks or adds to existing stacks, the vendor said, and is priced at $49.95. All-American Software, 5612 International Pkwy., Minneapolis, Minn. 55428. 612-537-8910. <<<>>> Title : LANs: Closet data centers Author : Thomas L. Nolle Source : CW Comm FileName: nollecol Date : Jan 16, 1989 Text: When asked in a 1982 survey what negative aspect of personal computer growth they feared most, data processing managers fretted over a potential loss of central information control. The managers believed PCs would encourage the ``privatization'' of information by erasing the traditional link to the central computer complex. This image has stood the test of time: Recent advertisements by a minicomputer vendor portray this same image of data banditry but in the context of local-area networks. We've survived the wholesale movement of corporations to PCs without realizing the worst of those 1982 fears, but most businesses can point to episodes in which there were problems because of the gradual diffusion of data into nonintegrated systems. In fact, the problems that can occur when everyone owns a private database _ unknown to others and not subject to any form of audit or verification _ are one reason why the integration of work groups via PC LANs is a growing priority. But might the cure be a step toward a more serious disease? Take a real-life example. In 1983, a major New York bank had to create special batch files to preserve data on its LANs in original form. 
Standard features such as record and file locking didn't stop people from making private copies of files and modifying them _ resulting in multiple, inconsistent versions. No one had anticipated the problem, and it cost thousands to correct. The bank was involved in an early distributed processing application, one of the very first to use PCs. The fact that the networks of that time did little to provide formal support for central database protection led users, vendors and consultants to approach with caution the development of applications that relied on those databases. But this year, LANs will have technical facilities allowing them to be what many have claimed they were all along: substitutes for departmental or central information processing resources. That could give users a false sense of security in dealing with the kind of problem the bank had six years ago. In products like Novell's System Fault Tolerance Netware, Microsoft's OS/2 LAN Manager, IBM's OS/2 LAN Server and the growing number of SQL servers, the state of LAN technology has advanced to create for the first time an environment in which basic DP controls over information storage and retrieval can be applied. These products provide transaction backout, ghosting/duplexing of disks to maintain parallel databases and activity audit trails developed through centralized smart servers. Such features will surely eliminate some of the barriers to making a LAN the equal of a minicomputer or mainframe. Vendor literature seems to promise an environment that takes care of itself. It's a wonderful dream, but this encourages users not to think about the technology, and that's always bad. These new features are simply tools. The questions are who will apply them, enforce their use and regularly test their operation to ensure their continued integrity. Nearly half the LANs in use today have no real centralized administration function. 
Sure, most have a designated administrator, but that person often does little more than assign passwords and access rights. In one manufacturing company, the administrator resigned and the position remained open for almost six months because no one knew it had ever been filled.

Get serious

If LANs are to grow into real work group support status, then businesses must begin to take LAN administration seriously. Few businesses would permit a minicomputer to support 100 on-line users without any operations staff, yet these same users will permit LANs twice that size to exist without formal administration. If a LAN is a substitute for walking a disk to the next office, this may be survivable; if the LAN functions as distributed work group support technology, it may not be. Companies need to appoint a qualified administrator who participates in critical application design decisions, selection of application software, review of proposed configuration changes that might affect the community of users and more. In short, the LAN needs a DP specialist to administer the collective data processing activities it is called on to support. Perhaps the reason that this almost self-evident conclusion seems to be resisted in practice is that PCs and LANs are seen by end users as a way to wrest control of their destiny away from bureaucratic MIS types. Having deposed one unwanted monarch, these users are reluctant to raise up another. But ultimately, there is only one choice. OS/2, LAN Manager and LAN Server, database servers, gateways, bridges and other emerging technologies are pulling the LAN away from the transparent and populist piece of wire that so effectively cut back on pedestrian traffic in work groups. To the extent that LANs are substitute DP resources, they must be administered as such. By Thomas L. Nolle; Nolle is president of CIMI Corp., a communications consulting company based in Haddonfield, N.J. 
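The private-copy hazard Nolle describes _ locking that serializes access to a shared file but cannot stop a user from duplicating it and quietly editing a private version _ can be sketched in a few lines. This is a hypothetical modern illustration using Unix advisory locks, not the bank's actual software; the file names and contents are invented for the example.

```python
import fcntl
import os
import shutil
import tempfile

# Hypothetical illustration: an advisory file lock keeps cooperating
# programs from writing over one another, but it cannot stop a user
# from copying the shared file and modifying a private version.
shared = tempfile.NamedTemporaryFile("w", delete=False, suffix=".dat")
shared.write("account,balance\nA-100,1000\n")
shared.close()

with open(shared.name) as f:
    fcntl.flock(f, fcntl.LOCK_SH)           # the shared file is "locked"...
    private = shared.name + ".private"
    shutil.copyfile(shared.name, private)   # ...yet copying it succeeds anyway
    fcntl.flock(f, fcntl.LOCK_UN)

# The private copy can now drift out of sync with the original _ the
# multiple, inconsistent versions the bank had to pay to correct.
print(os.path.exists(private))
```

Catching such copies takes administration and audit trails, not locking alone _ which is the column's point.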
<<<>>> Title : Mac link replaces mail at Author : CW Staff Source : CW Comm FileName: arcoapp Date : Jan 16, 1989 Text: LOS ANGELES _ Atlantic Richfield Co. is looking to tie field operations to its headquarters with a modem-based link between depot users' Macintoshes and Macs on an Appletalk local-area network located here. Today, users at Arco's gas depots are mailing their Apple Computer, Inc. Macintosh disks filled with shipping information to the main office of Arco Products Co., the marketing distribution arm of Arco, headquartered here. The information is then loaded into Macs and sent to the company's Amdahl Corp. mainframe. Mailing the disks from depots to headquarters has proven cumbersome and time-consuming, said Tom Mullaney, manager of systems and administration at Arco Products. So Arco is looking at Portland, Ore.-based Infosphere, Inc.'s Liaison, software that would allow remote users to send information electronically from the depots into the home-office Appletalk network. Macs on the Apple LAN are linked via ``plain old telephone wire,'' according to Mullaney. There are two LANs composed of 50 Macintosh SEs at headquarters and two more connecting 30 Mac SEs at a separate site in Los Angeles, which serves as the main office for Arco's Paypoint system. Paypoint allows consumers to buy gas using their automated teller machine cards. The four LANs are linked using Hayes Microcomputer Products, Inc.'s Interbridge, an add-on device that links multiple Appletalk networks via Hayes modems. The Appletalk LAN at Arco Products' headquarters is connected to an Amdahl host using Tridata Corp.'s Netway 1000 gateway. The host stores customer files, pricing and tax charges and daily production records. One obstacle to deploying the electronic file transfer application between the remote and local sites is the depots' current use of 2,400 bit/sec. modems. The company needs to install 9.6K bit/sec. 
modems to surmount the current ``interminable wait for data transfer,'' Mullaney said. He said he hopes to solve this problem with the latest release of the Applefax Modem, which is said to support file transfer at speeds up to four times faster than standard 2,400 bit/sec. modems. Applefax enables users to send facsimile files to and from virtually any type of fax machine. It also enables users to exchange data files with other Macs equipped with Applefax modems. It was taken off the market in October to fix bugs in the software and firmware, but the revised version will ship as of Macworld later this month, according to Apple.

Other facets

An electronic link between depots and headquarters is only one of several ways that Arco is employing computers to increase the efficiency of its operations. Using Macdraw II, Apple's drawing program, depot staff members create diagrams of various Arco service stations for delivery-truck drivers. ``That way, he knows where the tanks are,'' Mullaney explained. ``He can figure out the best way to pull in to reach them.'' The maps are stored in a graphics database. Performance information _ including the number of miles trucks have traveled, how many hours those journeys took and the number of hours employees have logged on those trips _ is recorded using Microsoft Corp.'s Works. Other data reported includes a daily record of how much gasoline is delivered and the safety records of each driver. Mullaney said between 400 and 450 gallons of gas are shipped monthly from Arco's storage depots. The bulk of the performance information is entered into Macs at those sites. ``They do the groundwork by accumulating the information that goes into our overall performance analysis,'' Mullaney said. That data is vital in determining the price of gas to the service station franchisees, he noted. 
By Julie Pitta, CW staff <<<>>> Title : Data seen increasing on T Author : CW Staff Source : CW Comm FileName: muxdata Date : Jan 16, 1989 Text: MARLBORO, Mass. _ Data communications, in particular IBM Systems Network Architecture transmissions, will make up an increasing percentage of the traffic carried by T1 multiplexers in the next two years, according to a recent study. ``The Multiplexer Market, 1988 _ an End-User Assessment'' is a study put together by The Market Information Center, Inc.'s Comm/Surv group. The research firm, based here, surveyed approximately 325 U.S. organizations, of which nearly 30% used T1 products and services. Of those companies, 26.4% were Timeplex, Inc. users; 12.1% were AT&T users; 12.1% were General Datacomm, Inc. users; 7.7% were Digital Communications Associates, Inc. users; and 7.7% were Network Equipment Technologies, Inc. users. More than 15 additional vendors supplied the remaining companies' T1 facilities, the research firm said. The average amount of traffic generated by companies surveyed consisted of 52% voice, 47% data and 1% video, the study found. Based on user responses, however, Comm/Surv predicted that in two years, voice traffic will make up only 47% of total traffic, while data's share will increase to 51%. The makeup of data communications traffic will also change over the next two years, the study found. The 1988 survey found that, on average, user traffic was 29% asynchronous, 20% bisynchronous and 44% IBM Systems Network Architecture/Synchronous Data Link Control (SNA/SDLC), with the remainder a variety of data types. Based on user responses, the study predicted that during the next couple of years, asynchronous traffic will decrease to 27% and bisynchronous to 13%, while SNA/SDLC traffic will account for approximately 50% of the whole. Product reliability, vendor stability and customer support were the top three criteria that users use in choosing a T1 multiplexer, the study found (see chart page 37). 
By Elisabeth Horwitt, CW staff <<<>>> Title : U.S.-USSR network opens Author : CW Staff Source : CW Comm FileName: janbits Date : Jan 16, 1989 Text: Jan. 1 saw the inauguration of a packet-switching data network between U.S. and Soviet Union users. The net was provided by San Francisco/Moscow Teleport, Inc. in cooperation with the National Center for Automated Data Exchange of the USSR. The network is said to make U.S. database services available to Soviet subscribers. It is accessible through several local carriers' services, as well as through Globenet, Inc. Ungermann-Bass, Inc. has contracted to purchase IBM Systems Network Architecture products from Rabbit Software Corp. Under the agreement, the network vendor will pay Rabbit royalty fees over a two-year period. Infotron Systems Corp. recently made its first shipment of Streamline 45s, which it claims are the first T3 networking multiplexers to be marketed. Two of the units, which reportedly support transmission speeds of up to 45M bit/sec., were delivered to Sungard Recovery Services, Inc., a disaster-recovery service vendor. Cabletron Systems, Inc. has inked a pact with LSI Logic, Inc. to develop a 10BaseT-standard transceiver chip. Cabletron will have sole rights to market the new chip, which will support 10M bit/sec. Ethernet over unshielded twisted-pair cable. 3Com Corp. and France's Telesystemes Reseaux have signed an agreement to jointly develop an X.400 gateway and X.25 router. 3+Open Reach/X.400 and 3+Open Internet/X.25 reportedly will enable a mix of personal computers, including IBM Personal System/2s and Apple Computer, Inc. Macintoshes on 3Com networks, to exchange electronic mail via public E-mail systems and proprietary systems such as IBM's Professional Office Systems and Digital Equipment Corp.'s All-In-1. The two products are slated to be available in French and English by mid-year. Prime Computer, Inc. 
has announced an X.400 tool kit said to allow 50 series supermini software programmers to develop E-mail, gateway and other messaging systems within local- and wide-area networks. Information can be formatted as text, graphics, digitized voice, image, facsimile and electronic data interchange transactions. Future kits will support Unix-based systems and Ethernet LANs, the vendor said. The second phase of Prime's X.400 strategy entails an end-user mail implementation, with a PC interface slated to ship late this year. <<<>>> Title : Big doings down under Author : CW Staff Source : CW Comm FileName: bitsy Date : Jan 16, 1989 Text: TOORONGA, Victoria, Australia _ Coles Myer Ltd., a retail company based here, is building what reportedly will be the world's largest private digital backbone network. The network will link Coles Myer's approximately 1,500 retail outlets throughout Australia, as well as planned sites on other continents, the company said. It is expected to supply the retailer's voice, data, image, text and video communications needs through the late 1990s. Datacraft Australia Pty. Ltd. will design and install the backbone, which will be based on T1 multiplexers from Newbridge Networks Corp. The backbone will support the company's IBM Systems Network Architecture network, its Digital Equipment Corp. Decnet of Vaxclusters and a voice network made of NEC Corp. NEAX 2400 digital private branch exchanges. One key feature of the Newbridge/Datacraft proposal was the use of Newbridge's 3612 Narrow Band Multiplexer, which allows voice and data transmissions to be integrated over one channel that supports 64K bit/sec. or lower rates, Newbridge said. <<<>>> Title : Plus Development Corp. ha Author : CW Staff Source : CW Comm FileName: netplusd Date : Jan 16, 1989 Text: Plus Development Corp. has announced an expandable hard-disk subsystem for personal computer local-area networks. 
Called Plus Impulse, the drive reportedly achieves a 12-msec effective access time and offers the ability to connect up to 32 drives onto a single PC LAN server. The product provides a maximum data-transfer rate of 4M byte/sec., according to the company. Plus Impulse is priced from $995 to $1,379, depending on drive capacity and components. Plus Development, 1778 McCarthy Blvd., Milpitas, Calif. 95035. 408-434-6900. <<<>>> Title : A high-capacity storage s Author : CW Staff Source : CW Comm FileName: netepoch Date : Jan 16, 1989 Text: A high-capacity storage server for technical workstation networks has been announced by Epoch Systems, Inc. Said to offer Sun Microsystems, Inc. Network File System compatibility, the Epoch-1 Infinitestorage Server combines magnetic and optical disk drives to provide 150G bytes of storage capacity. The servers are available in three versions and are priced from $95,000 to $450,000 depending on configuration and storage requirements. Epoch, 313 Boston Post Road W., Marlboro, Mass. 01752. 617-481-3717. <<<>>> Title : Automated Design has anno Author : CW Staff Source : CW Comm FileName: netautom Date : Jan 16, 1989 Text: Automated Design has announced its Windows Workstation family of local-area network utilities for users of Microsoft Corp.'s Windows and LANs. The software reportedly includes an integrated menu system as well as a network printer and background security control functions. The system may be purchased modularly or as a package, with prices ranging from $695 to $1,195. Automated Design, Suite 112, 133 Johnson Ferry, Atlanta, Ga. 30068. 404-988-0969. <<<>>> Title : E-Soft, Inc. has revised Author : CW Staff Source : CW Comm FileName: netesoft Date : Jan 16, 1989 Text: E-Soft, Inc. has revised its multiline bulletin-board system and communications handler. 
Version 2.1 of The Bread Board System (TBBS) reportedly offers multiline operation of up to 16 incoming telephone lines on a single IBM Personal Computer XT, PC AT, Personal System/2 or compatible. According to the vendor, enhancements have been made to the software's menu operations, message bases and file-transfer functions. TBBS 2.1 costs $895. A single-line version is also available for $299.95. E-Soft, Suite 2550, 15200 E. Girard Ave., Aurora, Colo. 80014. 303-699-6565. <<<>>> Title : A software-based network Author : CW Staff Source : CW Comm FileName: netrybse Date : Jan 16, 1989 Text: A software-based network security system that offers boot protection through controlled access to the hard disk has been introduced by RYBS Electronics, Inc. Called Gatekeeper, the product runs on all IBM micros and compatibles; it works with IBM's PC-DOS 2.0 through 3.3 and most versions of Microsoft Corp.'s MS-DOS, the vendor said. Gatekeeper costs $79. RYBS, 2950 Central Ave., Boulder, Colo. 80301. 303-444-6073. <<<>>> Title : 3X U.S.A. has announced i Author : CW Staff Source : CW Comm FileName: net3xusa Date : Jan 16, 1989 Text: 3X U.S.A. has announced its 3X-Link16 system, which reportedly connects up to 16 personal computers, laptops and IBM Personal System/2s and offers background sharing of peripherals. The system consists of software, a parallel-port add-on device and a 12-ft cable, the vendor said; files can be transferred in background mode at 500,000 bit/sec. The product is priced at $199 for the first two stations and $139 for each additional station. 3X U.S.A., One Executive Drive, Fort Lee, N.J. 07024. 201-592-6874. <<<>>> Title : A software package that a Author : CW Staff Source : CW Comm FileName: netemerg Date : Jan 16, 1989 Text: A software package that allows IBM Personal Computers, PC XTs, ATs and compatible systems to act as an X.25 bridge between multiple Ethernet local-area networks has been announced by Emerging Technologies. 
Called ET/Bridge, the product reportedly works in conjunction with off-the-shelf communications cards to convert PCs into effective Ethernet X.25 bridge nodes. Each PC is said to be capable of supporting up to 64 remote LANs for an unlimited number of users at speeds up to 64K bit/sec. ET/Bridge costs $1,095. Emerging Technologies, P.O. Box 1525, Mineola, N.Y. 11501. 516-742-2375. <<<>>> Title : A tradition of technology Author : CW Staff Source : CW Comm FileName: dartbox Date : Jan 16, 1989 Text: At Dartmouth College, a small liberal arts school in the snowy hinterlands of New Hampshire, technology is part of tradition. This harmonious situation exists largely due to the influence of John Kemeny and Tom Kurtz, who created the Basic language there in 1964 and initiated a campuswide time-sharing network. Since then, the college's centralized computing operation, organized around the Kiewit Computation Center, has spearheaded a major commitment to computing. Today, there are more than 6,000 Apple Macintoshes on campus (with just 5,000 students), and 85% of all students will purchase a computer during their undergraduate careers. And though it is relatively small, Dartmouth has big plans. The college and its library have initiated an integrated information retrieval system called Dartmouth College Information System (DCIS), while the Thayer School of Engineering there has embarked on Project Northstar, an ambitious multiphase development plan. DCIS will combine a user-friendly windowed interface on the Mac, the high-speed Kiewit campus network and a variety of databases that will give campus users access to new and better educational tools. Eventually, DCIS will provide the ability to combine numerical, textual and compound documents, as well as video and audio. 
The goal of Project Northstar, which started in 1986, is the development of computerized curriculum materials and the construction of a communications network to deliver programs to faculty and students. According to Dan Lynch, associate dean of Thayer and principal researcher on the project, Northstar will eventually incorporate 400 high-powered Unix-based workstations. Even Dartmouth's exceptionally receptive computing atmosphere is not immune to conflicts, however. Because Dartmouth's scientific/engineering program is relatively small, Northstar fights an inherent prejudice against the value of Unix. With Macintoshes everywhere on campus, it has become a political battle to get support for the minority operating system. GLENN RIFKIN <<<>>> Title : It's a computer-based con Author : Glenn Rifkin Source : CW Comm FileName: campcop Date : Jan 16, 1989 Text: ``Here, there's no bottom-line measure; we're trying to produce knowledge, not profits.''
_ James Poage, vice-provost for computing, Dartmouth College

Knowledge may not translate easily into dollars and cents, and institutions of higher learning may disdain terms such as competitive advantage, but a bottom line does exist even for those who dwell in ivory towers. And there is an increasing awareness among information systems professionals on America's campuses that the bottom line is dependent on information technology. ``The currency of academia is reputation,'' says Kenneth King, president of Educom, a consortium of 574 universities that deals with information and technology issues for higher education. In order to attract and retain the best faculty and staff in an ever-shrinking talent pool, universities and colleges are finding that they must offer more than picturesque quadrangles and fully stocked libraries. State-of-the-art computing is becoming a priority requirement for campus America. 
Few university information systems executives would argue with the forecast offered by John Sculley, chairman and chief executive officer of Apple Computer, Inc., at a recent meeting of Cause, the professional association for computing and information technology in higher education. ``The real structure of universities in the 21st century will not be bricks and mortar,'' Sculley said. ``It will be information systems.'' In fact, if anything, some analysts might advance the timetable slightly. King, for example, says he believes that ``the effective use of technology will play a major role in a school's ranking in 10 years.'' Achieving the levels of technological sophistication that will soon be necessary to keep up _ let alone stand out _ will not be an easy task for university information systems executives, however. For most, it will mean an ongoing financial and strategic struggle. Despite the fact that most technological breakthroughs are spawned on campuses, the average institution of higher learning is well behind its corporate cousins in embracing technology. The average cost of computing as a percentage of total budget in higher education is just 2.5%, which is up just 1% in the last five years, according to King. ``Universities are in an information-intensive business and should track the same as the business world,'' he says. ``But universities on average are five years behind in expenditures.'' One problem is that colleges and universities, unlike private industry, cannot always easily measure the benefits of technology and factor them into fiscal plans. Both state and private institutions face financial uncertainty at a time when expenditures on technology should be rapidly increasing. The financial crunch is particularly acute at state schools, which must depend on the largesse of government officials. 
Says one campus chief information officer, ``The university has identified computing as a high-priority item, and the Board of Regents has endorsed it, but as far as the governor and legislature are concerned, that's a long way from being cash.'' As a result, many schools seek relationships with vendors that can provide equipment grants and discounts. ``We're cheap, we're always looking for a handout,'' says Thomas West, assistant vice-chancellor of computing and communications resources for the California State University system. ``I wish it didn't have to be that way.'' At many schools, including prestigious research institutions, the proliferation of personal computers and workstations has also produced considerable financial nightmares for traditional centralized computing centers. As computing power has become distributed to individuals and departments, the need to buy time on mainframe cycles has diminished, and computing centers that used to support themselves with the surcharges on time-sharing cycles are in trouble. ``It's a major problem, and budgets are being cut,'' says King, adding that while computer center budgets are dropping, expectations are rising. Researchers, in particular, are demanding access to national academic networks such as the National Science Foundation's NSFnet, which gives them time on one of several supercomputer centers in the country. And money is not the only problem. Information systems professionals on campus face a host of other challenges when they try to extend the reach of technology. Although these challenges vary dramatically depending on the size, culture and funding status of the institution, there are problems that span the whole range. For example, the toughest assignment at any institution of higher learning is finding consensus in a world of independent thinkers and autonomous groups. 
``On campus, no one will stand up and insist on anything,'' says Robert Street, vice-president for information resources at Stanford University. ``We operate more on the carrot principle than on the stick principle.'' Nowhere is the gap in thinking wider than between academic and administrative computing on campus. Not unlike the separation of church and state, these two bastions of campus power have traditionally kept a safe distance from each other. But in the last five years, many campuses have sought to bring the two environments together in hopes of achieving certain economies of scale. Not everyone agrees with the trend, and obstacles continue to plague such a union. Administrative MIS veterans like Sam Splice, director of telecommunications and administrative systems at the University of Michigan, actively lobby to keep the administrative and academic worlds separate. ``Our environment wouldn't work for them, and theirs wouldn't work for us,'' Splice says. ``We'll cooperate where we can.'' Illustrating the division between the two sides was the reaction to a proposed merger of Cause (an organization of individuals on the administrative side) and Educom (academic side). At a recent Cause conference in Nashville, Splice, along with a vocal contingent of administrative information systems veterans, worked to keep the two from merging. ``In the academic world,'' Splice says, ``everybody thinks he is a manager. [Academics] want access to everything but they don't want to support the same packages or protocols; they are all over the place.''

High expectations

Splice says that academic users want support and response, and they cannot understand when the systems people do not move as quickly as they expect. ``Our databases are huge and hierarchical and are tough to navigate through,'' Splice says. ``We try to serve both sides but demands on us have grown as the technology has matured. 
The question is, Who provides the infrastructure?'' The answer to that question varies from campus to campus. At Stanford, the decision to reunite administrative and academic computing in 1986 has proven to be a critical and financial success, according to Street. ``We have a $45 million operation,'' he says. ``With something that big, you can take advantage of the economies of scale.'' The university needed an IBM 3090 Model 300 with vector processing to handle its peak computing loads during the day. Street sought to cushion that investment by actively selling time on the free cycles during the evening. He offers these near-supercomputer capabilities for a low rate ($40 per hour) to faculty and students who need the large-scale computing power. The university provides funds for users without external support and charges the research grants of those with support. Unfortunately, many schools have had less positive experiences. At the University of Southern California (USC), the academic director of budget planning looked at the numbers involved in supporting three separate groups _ academic, administrative and scientific and engineering _ and decided in 1984 that all computing should be placed in one center. The vice-president of finance, who had contentedly built a financial system around Prime Computer, Inc. equipment, declined to participate. But the IBM group, representing the main computing center, and the Digital Equipment Corp. group, from science and engineering, set out to merge. According to Dianne Bozler, director of USC Software Systems, it was a mismatch of cultures from the beginning. The director of the IBM installation was put in charge, and although the IBM side was made up of environments such as SAS and SPSS, the systems staff was handpicked from the DEC environment. The DEC people quickly learned the IBM protocols, but that was not enough. ``We had a bunch of guys who had come from this exciting DEC environment who were told `No more hacking. 
We have paychecks to run,' '' Bozler says. For a year, the university stayed the course, bringing in consultants to help facilitate what was turning into a painful marriage. Deadlines were missed, reports were not delivered and service declined across the board. ``It simply didn't work,'' Bozler says. ``There was a total inability to understand that you just can't bring a system down in the middle of the day. If they needed to run something, they just brought the system down with no warning at all, and we couldn't stop that behavior.'' Decisions as to the coupling or uncoupling of academic and administrative computing continue to be hammered out in large institutions. But that is hardly the only pressing issue. Organizing strategically to serve the varied constituencies will determine success in a competitive marketplace. ``The problems are not all that different from major corporations, but higher education tends to look within itself rather than to the corporate world for solutions,'' says R. Schuyler Lesher, a consultant at Nolan, Norton & Co. in Lexington, Mass., who has spent 12 years viewing academia's use of technology. Lesher reports that often in higher education, strategic planning neglects information technology issues; worse, the plans center around how schools are currently organized rather than how they might be in the future. Some of the major planning challenges facing information systems on campuses are:

Networks. Networks have become a priority for information systems professionals in higher education. Information access to the larger academic community is becoming a measure of competitive advantage at a school. Researchers and students are now demanding access, but the road to connectivity is filled with potholes. Laying a network across traditionally hierarchical organizations, not to mention diverse academic and administrative computing environments, is no easy task. Infrastructures must be introduced, and that takes careful strategic planning. 
``Putting a network on an old hierarchical structure takes some real visionary planning,'' Lesher explains. ``They have to look to new models.'' Adding to the confusion is the proliferation of national and international networks that serve academia, such as BITnet and NSFnet. Richard West, assistant vice-president of information systems and administrative services at The University of California, can attest to just how tricky networking can be. West faces a laundry list of networking issues. He oversees the computing environments for the entire University of California system, which encompasses nine campuses. He recognizes that each campus must have autonomy, but a strategic direction is critical if all of the campuses are to benefit from information technology. This is no easy task since there are 15 major computing centers on the campuses and 14 have standardized IBM MVS environments for administrative computing. On the academic side, computing evolved on a peer-to-peer basis, and thus Transmission Control Protocol/Internet Protocol (TCP/IP) became a de facto standard. TCP/IP-based networks are fine for certain needs, Richard West says, but there is no central network management, and therefore, no one knows if the network is working. The University of California, Los Angeles is currently working with IBM to support the interoperability of TCP/IP and IBM's Systems Network Architecture.

Keeping abreast of new technology. Without deep pockets to fund infusions of new equipment, many schools struggle with outdated systems. Hooking new technology, such as workstations, into these patchwork quilts of homegrown code and cut-and-paste networks is no simple task. And just when computing center managers thought they had a handle on PCs, Steve Jobs unveiled his Next, Inc. machine and targeted it squarely at the university market (see story page 47). ``The campus grows obsolete at a frightening rate,'' Dartmouth's Poage says. 
``This year's seniors will graduate with 128K-byte Macintoshes, which are now just boat anchors. How do you keep the faculty and staff up to date with what the freshmen have?'' Old-line DP types who have been shifted into strategic MIS positions must be ready to oversee and lead that new wave or risk getting swamped by it. Are they up to handling it? ``No,'' says California State University's Tom West. ``The Peter Principle is definitely at work. Out of 19 campuses in our system, we've seen an information systems leadership change on 12 of them in the last five years.'' Security. Computer viruses tend to strike first on campus, particularly at high-tech-oriented schools such as MIT or the University of California, Berkeley. Widespread use of Unix makes a rampaging virus significantly more dangerous. Information systems managers on campus are less concerned about hackers, however, than about the potential threat to data integrity inherent in opening databases to the end users sitting at PCs and workstations around campus. ``There's no reason not to give students access to grades and registration material, but we're holding up on that because we're afraid to let students into the databases,'' University of Michigan's Splice says. Library automation. Universities and colleges are scrambling to get their libraries on-line and automated, but millions of volumes, reports and other printed material make this a costly and time-consuming task. At Dartmouth, the library was the first entity to get involved with computers, according to Margaret Otto, Dartmouth's head librarian. And, she claims, Dartmouth was among the first to get its card catalog on-line and to develop a computer-based acquisition and control system. Still, even with an early lead, keeping up with demand is a challenge. Faculty and students now expect 24-hour access to libraries from their workstations, and as disk storage technology improves, full-text availability will become a must. 
``The library building may become obsolete down the line,'' Otto says. Impact on education. Finally, perhaps the most crucial planning question concerns what impact computing will have on the actual learning process. Although most undergraduates are computer literate, and the PC is common enough to be considered part of the furniture, its impact on learning remains unclear. John Kemeny, former Dartmouth president and creator of the Basic language, says he believes that the computer is an expository tool but that it does not necessarily change what is learned. Lawrence Levine, director of user services at Dartmouth, claims the computer is just one small factor in a student's life and it is difficult to separate out just what it does for the learning process. ``Students have access to more information, but do they write better papers? No, they write longer papers, which are easier to read.'' ``We'd like to see computing affect the entire course curriculum,'' says Stanford's Street. It hasn't happened yet, he says, ``but we think it will.'' Ironically, the most financially secure institutions are not necessarily at the forefront of technology. ``I've seen as much reluctance to make changes at schools with money as at schools without money,'' Lesher says. ``I've seen some real innovation at small schools which have taken advantage of commercial software packages out there. Smaller schools often don't have the burden of a huge infrastructure that has to be changed or gotten rid of.'' Keeping up with the Joneses In fact, regardless of an institution's size, some of the most technologically impoverished departments can be found side by side with the most heavily endowed. ``There is definitely a battle between the haves and have-nots,'' University of California's Richard West reports. 
``Faculty members in electrical engineering and computer science don't have enough space to put in all the machines vendors give them, while the art historian has to beg for a PC. There is definitely tension.'' In the long run, Richard West says that these arguments will become moot. ``We've got to start viewing computing as part of the university infrastructure just like the library, offices, desks and telephones,'' he states. ``The cost of recruiting the top students and faculty is going up. Prospective faculty now ask, `Do I get a workstation? Do I get supercomputing time? Am I on the national network? Is there staff to write software for my course?' We haven't made the commitment yet, but it's becoming a priority.'' By Glenn Rifkin; Rifkin is a Computerworld senior editor. <<<>>> Title : Animating the classics Author : Bonnie MacKeil Source : CW Comm FileName: perseus2 Date : Jan 16, 1989 Text: A group of professors from Boston University, Harvard University, Bowdoin College and Pomona College is working to breathe life into the study of classical Greek civilization. The means for this effort is Perseus, a software program now being tested in classes at Harvard and Bowdoin. The program is scheduled for completion in December. The program, which has been in development since July 1987, is set up as a hypermedia database using Apple's Hypercard. According to Greg Crane, director of the Perseus project, the system gives students a more vivid view of this ancient and complex period. ``We're trying to tie together as many different kinds of information about a rather messy subject [the reconstruction of the world] and tackle it as fluidly as possible,'' Crane says. Using Perseus, he explains, students are able to call up a map of Greece, point to a section of the country and view a map of that area. By clicking on a specific city within that area, they can get more detailed information, such as the presence of artifacts and ruins. 
If they need more information, they can call up articles about the subject. The program also enhances the study of Greek literature. When reading a play, for example, a student can re-create the staging, positioning the actors on the screen and trying out entrance and exit possibilities. This level of visualization changes the way students think about the text, Crane says, and gives it an immediacy that it never had before. According to Crane, students learn more in less time using Perseus, and they get more excited about their work because the payoff for their time investment is bigger. ``They are able to learn things that even professional scholars weren't able to learn before,'' he adds. By Bonnie MacKeil; MacKeil is a Computerworld researcher. <<<>>> Title : Rehearsing on a small scr Author : Sharon Baker Source : CW Comm FileName: creation Date : Jan 16, 1989 Text: Students of the performing arts will soon have a new medium on which to fine-tune their productions. The Creation Station, a software program designed to run on Next, Inc.'s Next machine, will help these artists-in-training generate, edit and produce a play, dance or video on computer. The software was designed by David Gregory, director of the Center for Performing Arts and Technology at the University of Michigan in Ann Arbor, along with colleagues Hal Brokaw and Henry Flurry. The program gives students more hands-on exposure to fine arts. Gregory envisions universities setting up Creation Stations in individual rooms, much like practice rooms in a music department, in which both students and faculty can conduct individual work on their own time. ``The bottom line is that creative artists are going to have a really potent tool for either creating things or for simulating things,'' Gregory says. The Creation Station permits the manipulation and integration of animation and music. 
A choreography student, for example, can create a floor pattern by moving a three-dimensional representation of a dancer across the screen with a mouse, terminating the action by clicking the stop button. He can then create two more patterns on two separate tracks. A simple touch of the play button will allow the student to simultaneously review the three steps he just established. Once the steps are choreographed to the student's liking, music can then be added to another track, and the multimedia performance can be played back to see how the dancers relate to the music before rehearsals take place. The system, expected to become available in the second quarter of 1989, was demonstrated at the recent Educom conference held in Washington, D.C. By Sharon Baker; Baker is a Computerworld assistant editor, features. <<<>>> Title : No rest in administration Author : Patricia Cinelli Source : CW Comm FileName: demand Date : Jan 16, 1989 Text: More and more demands are being placed on administrative computing services at colleges and universities. According to Carol Barone, vice-president of information systems and computing at Syracuse University in Syracuse, N.Y., working in administrative systems now means constantly learning new ways of doing things in jobs that will not remain static. Universities have always been complex organizations, but they have never been forced to work quite as hard at management as they now must do in order to contend with mounting competition for quality students and faculty, financial pressures and escalating requirements for accountability. For all of those reasons, notes Joe Wyatt, chancellor of Vanderbilt University in Nashville, colleges and universities are increasingly dependent ``on the prompt availability of critical information in a form usable by decision makers at various levels in the organization.'' In many cases, prompt availability of information is synonymous with direct access. 
Barone says that demand is rising on her campus for products that allow personal computer users to manipulate data on a mainframe. ``Our clients want information faster and want to do more with it,'' she says. ``They want to analyze it themselves so they can make better decisions.'' What that means, however, is a much heavier responsibility for the information systems organization in terms of both support and guardianship. At Syracuse, the support aspect has been handled, in part, through the reassignment of staff from applications groups to administrative support groups that provide services to staff and faculty. Safeguarding systems integrity in the more open environment is a more difficult problem. ``Fifteen years ago, a system assumed a certain ethical standard. Now, security has to be built into applications,'' Barone says. ``In the past, we had supervisors authorize access to records by hand. Now, [user demand dictates that] we have an automated security system that checks individuals against authorization records. With end-user computing, it becomes important to provide access based on the individual's need to know.'' And it is not only faculty and staff that are stepping up the pressure on administrative systems. Government reporting requirements are placing greater demands on administrative computing, particularly in the area of financial aid. At the University of Maryland in College Park, for example, Butch Reinke, director of the administrative computer center, says the amount of data the university's financial aid officers are asked to collect on each applicant for financial aid has almost quadrupled this year. Another finance-related area is adding to the systems work load. Fund-raising efforts at many colleges and universities have taken on a new urgency of late, and that means more dependence on information systems. 
According to Kenneth Pollack, vice-president of information resource management at Wright State University in Dayton, Ohio, college administrators are constantly asking, `` `Who is our audience? How [can we] put together a consistent consolidated approach to raising capital, keeping track of donors and alumni, distributing correspondence and following up?' To effectively perform these tasks, you really need the support of flexible computer programs,'' he says. Washington, D.C.-based Gallaudet University, a small liberal arts institution with about 2,000 students and 2,000 faculty and staff, recently began using PCs and modems to contact its hearing-impaired alumni and potential donors on telecommunications devices for the deaf (TDDs). These messages previously had to be manually typed on TDDs for each call. Kevin Casey, director of computer services, says easing the contact process decreased phone time and enabled development personnel to increase the number of donor contacts made in a campaign. Just making the contact, however, is only part of the job. At Wright State, the kinds of questions that potential donors raise when they are contacted helped to suggest yet another new activity for the systems department. At Wright, computers are now helping the university conduct institutional research that it hopes will provide answers not only for curious potential donors but also for others with questions, including government officials, administrators, parents and faculty. Recruitment is another area of systems demand that keeps evolving, according to Pollack. In fact, the phrase now used at Wright State is ``enrollment management,'' which sums up the more sophisticated approach being pursued in prospect identification. The prevailing philosophy is that keeping students is at least as important as attracting them. By Patricia Cinelli; Cinelli is a free-lance writer based in Washington. <<<>>> Title : Satisfying both technolog Author : Stewart H. 
Robin Source : CW Comm FileName: netsides Date : Jan 16, 1989 Text: It is not uncommon for designers of campus networks to feel pulled in two different directions. This is because most colleges and universities house two distinct and different sets of users with sharply divergent network requirements and support expectations. One set, those interested in technology for its own sake, demands state-of-the-art tools. They know what they want and prefer to find their own mix. The hardware platforms that these technology-oriented users utilize generally include Sun Microsystems, Inc. workstations, Apollo Computer, Inc. equipment, Digital Equipment Corp. Microvaxes, IBM Personal Computer RTs and Apple Computer, Inc. Macintoshes. Unix is their operating system of choice except for the Macintosh, which still requires Apple's proprietary operating environment. Networking is done via the Transmission Control Protocol/Internet Protocol suite. The only intervention this group usually wants from the central support organization is assistance in campuswide quantity discounts or site licenses and provision of campuswide network backbones interconnected to the external world. At the other end of the spectrum are users who view technology as a means to an end. They want systems that are easy to use, along with full support services. These solution-oriented users are pragmatic in their approach, although not necessarily unsophisticated. The often-referenced personal computer power users fit into this group just as well as low-level users. These users emphasize packaging and expect universal answers to all their needs within a strict budget. This group's platform choices tend to be IBM Personal Computer ATs and Personal System/2s or clones and Macintoshes, and these users prefer networks that do not alter the operating ``feel'' to which they have become accustomed. It is difficult, if not impossible, to find uniform integrated systems that will satisfy both groups. 
But the two populations often overlap in terms of geography and work, so the network designer must create structures that suit separate interests but are also compatible enough to permit communication between both sets. At the University of Michigan, we currently have Banyan Systems, Inc.'s Virtual Networking Software, or Vines, as the PC local-area network of choice and Apple's Appletalk as the preferred Macintosh LAN for our solution-oriented users. We offer full support and training, detailed network configuration and assistance in network installation. We also offer a limited amount of support and some informal training for 3Com Corp.'s 3+. We are currently reviewing Novell, Inc.'s status, with the hopes of offering full Netware support. We recently installed twisted-pair wire within our campus for data and voice and tried to minimize the need for more extensive specialized wire. As part of our efforts to explore this medium, we have been working with Northern Telecom, Inc.'s Lanstar high-speed packet transport. We have found the Lanstar to be a good transport for users of Vines and Netware as well as Macintosh IIs. We now support it for those users when there is a great enough concentration to make it cost-effective. Farallon Computing, Inc.'s Phonenet is supported for all Macintoshes. Ethernet and token-rings are occasionally used. The solution-oriented users generally accept whichever of these strategies is most cost-effective. The technology-oriented users generally prefer Ethernet, so we also have extensive Ethernet. The Lanstar tends to be rejected by these users, partially because it is proprietary but mostly because it will not support Ethernet-equipped workstations, and many advanced-function workstations come with integrated Ethernet adapters. The technology-based users do not want as extensive an effort made for them as do the solution-based users. The university has been trying to set up a structure to support these users. 
There are major groups supporting advanced-function workstations and high-speed networking and projects with major vendors, including IBM and MCI Communications Corp. These projects have been undertaken by the information technology division to place the university on the technological frontier. The Center for Information Technology has also been established to facilitate these efforts. The Institutional File Server is a joint project with IBM to develop a facility in which large IBM mainframes can be used as secure, central and distributed file servers in a multivendor academic environment. The project will also build in file access work that is being developed at Carnegie-Mellon University. It will provide services in a Unix environment using Sun's NFS protocol and, for the Macintosh, Apple's AFP protocols, so that Unix and Macintosh users will access the network in a consistent manner. The campus backbone uses fiber-optic cable provided by Northern Telecom. Its goal is to create a high-speed network using the emerging Fiber Distributed Data Interface (FDDI) protocol. We are working with several vendors with the objective of providing the first large operational multivendor FDDI network. One of our biggest challenges is taking the results of these projects, which appeal greatly to the technology users, and ensuring that they are of use to the solution users. By Stewart H. Robinovitz; Robinovitz is a telecommunications specialist at the University of Michigan's Office of Administrative Systems and Center for Information Technology Integration. <<<>>> Title : A directory of 1988 featu Author : CW Staff Source : CW Comm FileName: 88intro Date : Jan 16, 1989 Text: Computerworld presented a variety of regular and special features throughout 1988, covering a wide range of managerial and technical topics. In Depth offered 66 stories, written predominantly by industry experts and treating such subjects as benchmarking, executive support systems, Unix vs. 
Systems Application Architecture, Integrated Services Digital Network (ISDN) and quantum computing. Several In Depth stories were written by CW news staff members, including three profiles of leading-edge MIS organizations: Du Pont Co., J. C. Penney Co. and American Express Co. In November, our microcomputer reporting team scoped out MIS' personal computing options for 1989. Product Spotlight analyzed 21 product categories in the buyer's guide section. The areas covered ranged from database management systems to IBM's Systems Network Architecture market to PC graphics. In 1988, Executive Report examined 19 issues of particular relevance to MIS management, targeting seven vertical markets _ health care, airlines, manufacturing, education, banking, government and retail _ as well as large networks and PC networks, Transmission Control Protocol/Internet Protocol, ISDN, the Bell operating companies and systems integration. Computerworld Focus presented 11 issues last year, specifically covering different aspects of communications/connectivity, software, departmental computing, security and personal computing. The 13 special features offered by CW in '88 included the first annual Premier 100, three Computerworld Extras, the annual salary and job satisfaction surveys and hardware roundups. Reports on PC buying, the corporate Macintosh and computer careers for graduates rounded out the specials lineup. In 1988, 16 section features were presented, primarily covering management topics. This index will help readers locate articles that were published in Computerworld in 1988. To order a back issue, call or write the Back Issues Department, P.O. Box 9171, Framingham, Mass. 01701-9171, 508-879-0700, ext. 371. The fee is $2 per issue plus $1 postage and handling, prepaid by check to Computerworld. Please note that issues published on Jan. 4, Jan. 25, April 25 and July 18 are no longer available. 
For multiple reprints of individual articles, call or write to the Rights and Permissions Department at the above address and phone number, ext. 304. Reprints are available on 8 1/2- by 11-in. paper in quantities of 100 or more. <<<>>> Title : Clearing the air Author : Michael Pepelea Source : CW Comm FileName: pepeletx Date : Jan 16, 1989 Text: The benefits of data normalization in creating effective data structures are easily recognized by most database administrators. But many MIS executives have misconceptions about data normalization, and these misconceptions prevent them from supporting it within their organizations. Considering the background of many MIS executives, it is easy to understand why misconceptions develop. Many of those MIS executives achieved success because of their past technical superiority. But these skills were developed when life in the MIS world was much different than it is today. Flat files were the technology of the day. Only a few database management systems were available. Data sharing and distributed data structures were not major issues. Mainframe applications were the rule, and portability of data and connectivity were theoretical issues. Many MIS executives still view data issues from the perspective of that environment. If database administration hopes to gain MIS executive support for data modeling, it must address some of these misconceptions. MIS executives will then be more inclined to support a data normalization approach. One of the first misconceptions relates to systems design. It holds that normalization is a process engaged in by technical people with little regard for business issues. Therefore, data normalization does not address business needs and offers no benefit to the system design process. Nothing could be further from the truth. Database administration must point out that the first thing the data normalization process does is define the way in which users will access the data. 
These are called ``user views'' or local views and are obtained by reviewing user-defined screens and reports and investigating data flow diagrams, which should be defined with user assistance. Data normalization does not exclude business needs but is based on user requirements. Another common design misconception of data normalization is that it is unnecessary because structured analysis and design techniques define data relationships. Structured analysis and design are essential to good system design. They must be done to define processing requirements. However, structured analysis focuses on processes, not data. Structured analysis will provide data relationships and passage of data from process to process. That macro view does not define the interrelationships of data element to data element, which is the micro view. It is this view that is needed by database administration to construct databases to support the new system. This view of data can only be achieved through a rigorous data normalization effort. Another misconception concerns the physical database implications of normalized data. Key misunderstandings are that data structures produced from normalized data models contain too many segments or sets, have large access paths, require huge I/O operations and require complex key relationships. Thus, the misconception goes, the databases generated by the data normalization process will perform poorly and be difficult to work with. While it is sometimes true that normalized data structures may contain more segments than the traditional methods produce, this can enhance rather than inhibit performance. Because the user views define data access requirements, the data needed by a given portion of the application is concentrated in well-defined segments. This means that the application does not have to access data that is scattered all over the database. Thus, performance is better in data-normalized structures because I/O is better controlled and defined. 
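The redundancy argument Pepelea makes can be sketched in a few lines of code. This is an illustrative example only, not drawn from the article; the record layout and field names are invented to show how splitting one flat record type into two related ones eliminates duplicate copies of the same fact:

```python
# A traditional flat-file design repeats customer data on every order record.
flat_orders = [
    {"order_id": 1, "cust_id": 100, "cust_name": "Acme", "cust_city": "Chicago", "item": "widget"},
    {"order_id": 2, "cust_id": 100, "cust_name": "Acme", "cust_city": "Chicago", "item": "gear"},
    {"order_id": 3, "cust_id": 200, "cust_name": "Zenith", "cust_city": "Dayton", "item": "bolt"},
]

# Normalization splits the single record type into two relations, with each
# customer fact stored once and related to orders through the customer key.
customers = {}
orders = []
for rec in flat_orders:
    customers[rec["cust_id"]] = {"name": rec["cust_name"], "city": rec["cust_city"]}
    orders.append({"order_id": rec["order_id"], "cust_id": rec["cust_id"], "item": rec["item"]})

# A customer address change now touches one row instead of every order record,
# so no special routine is needed to keep redundant copies synchronized.
customers[100]["city"] = "Providence"

# Every order for customer 100 now sees the new city via the key relationship.
assert all(customers[o["cust_id"]]["city"] == "Providence"
           for o in orders if o["cust_id"] == 100)
```

In the flat design, the same update would have to be applied to two separate order records, and a missed one would leave the data inconsistent, which is exactly the synchronization problem the article describes.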
With normalized data structures, data and multiple access paths are defined only once. Data redundancy is reduced in comparison with traditionally designed data structures, eliminating the need for special maintenance routines to keep data synchronized. Furthermore, less updating takes place for singularly defined data than for data defined in multiple places. Thus, system performance is enhanced. Easier development Last, because normalized data structures are neatly organized and data is organized by access requirements, program development is easier than it is with traditionally developed databases. Programmers need to access fewer data segments because program functions address logically grouped data. Also, because data relationships and key requirements are defined up front, programmers are not faced with ``surprise'' data needs that may require reprogramming and reworking of the database. If these misconceptions of data normalization can be addressed, the only issue remaining is technology. Some MIS executives still feel that data relationship problems can be corrected with physical techniques in the DBMS. Many MIS executives believe that rigorous data normalization is not necessary because the physical aspects of a DBMS will allow the database administrator to fix data relation problems. This belief may be partially true in the traditional hierarchical, networked or inverted file structures. But in a relational environment, failure to properly define data relationships can cause serious problems. Database development for a relational environment must be different. In the relational world, much of the access to data, how indexes are established and how access paths are optimized is done under the covers. The database administrator or application programmer does not have the same level of physical control as he did with past DBMSs. Rather, the basis for control in a relational environment is the relationship of one data element to another. 
The way data is related dictates access and performance. The way to achieve this relationship is through a rigorous data modeling exercise. By Michael Pepelea; Pepelea is a manager in the IS consulting practice of Ernst & Whinney in Chicago. <<<>>> Title : Doing more with less Author : CW Staff Source : CW Comm FileName: zucchini Date : Jan 16, 1989 Text: ``I'm not everybody's favorite person around here,'' confided Michael R. Zucchini, chief information officer at Fleet/Norstar Financial Group, Inc., as he stepped from his red Jaguar sedan toward Fleet/Norstar's data center in Providence, R.I., earlier this year. It is not that Zucchini is unlikable; he was referring to fallout from his consolidation of data processing and bank operations after Fleet Financial Group, Inc. merged with Norstar Bancorp, Inc. a year ago. Since then, Zucchini has supervised 225 system conversions and the closing of seven data centers, including the one in Providence. The actions resulted in the elimination of 150 jobs, about half of them layoffs. ``It's been a year where every skill I've been able to develop over the years has been needed,'' Zucchini said in a follow-up conversation last month. Zucchini, 42, grew up in New York's Bronx borough, earned undergraduate and graduate business degrees at Pace University, served in the Army and became a programmer. More recently, he has bred race horses as a hobby and served on President Reagan's Private Sector Survey on Cost Control, known as the Grace Commission. Fleet Financial Group hired Zucchini as executive vice-president and CIO in August 1987, four months before completing the anticipated merger with the Albany, N.Y.-based Norstar. Lineage Zucchini came to Fleet from Stamford, Conn., reinsurance company General Re Corp., where he was president and chief executive officer of the DP subsidiary. In 1987, he won an award from Carnegie-Mellon University for General Re's Confer system. 
The system provides underwriters, sales staff and customers with data on accounts and reinsurance policies previously stowed in a warehouse. In 1986, Zucchini patented an outlet for portable computer terminals. With assets of $26 billion, Fleet/Norstar ranks 23rd among the nation's bank holding companies. Brokerage firm Smith Barney, Harris Upham & Co. ranked it as the top performer among regional U.S. bank holding companies for the third quarter of 1988. With 19,000 employees, seven banking companies in four states and a total of 1,000 offices in 40 states and London, it is a leader among the growing breed of ``superregional'' banks. Zucchini is CEO of Fleet/Norstar Services Corp., a 3,300-person subsidiary handling bank operations and DP. He is also CEO of Fleet/Norstar Business Data, Inc. in Newington, Conn., a DP service bureau formed in October, and is responsible for DP at Fleet/Norstar subsidiaries that provide processing for life insurers and issuers of student loans and mortgages. In contrast to the consolidation in banking, Zucchini is decentralizing some of Fleet/Norstar's nonbanking DP. The company has completed a data center for Fleet Real Estate Funding Corp., a Columbia, S.C., mortgage firm that had been supported by the corporate data center. The new center provides more flexibility and greater economy, since redundancy features make use of the bank center more expensive. But Zucchini has devoted most of his attention to banking. He has eliminated eight mainframes in closing the seven bank data centers and suspended use of three DP service bureaus. The consolidation has cut annual DP expenses for the banks by more than $10 million, or 13%, and the banking DP staff by 25%, Zucchini says. Banking DP was consolidated in a former Norstar data center in Albany with an IBM 3090 Model 500E. Fleet/Norstar is standardizing on the best of Fleet and Norstar systems and in some cases new ones, generally using in-house staff. 
Systems and programming staffs remain decentralized and have been reduced only through attrition. Fleet/Norstar offered different positions to some DP employees whose jobs were eliminated and tried to be generous toward laid-off workers ``both in emotional and financial support,'' Zucchini says. Zucchini helped minimize layoffs by taking advantage of attrition _ shifting vulnerable employees into vacated positions that were being retained and filling jobs slated for elimination with temporary consultants. A sense of balance Zucchini's handling of layoffs showed sensitivity but also a ``strong sense of reality'' required to address company needs, says Dave Sheppard, former Fleet DP chief and current co-chief operating officer of Fleet/Norstar Services. In addition, Zucchini learned the banking business very quickly, Sheppard says. ``He has a tremendous amount of energy and a tremendous appetite for information.'' J. Terrence Murray, the Fleet/Norstar president who hired Zucchini, says the company was attracted by Zucchini's organizational abilities more than his technical knowledge. He says Zucchini maintained day-to-day operations while moving forward with the restructuring. ``Mike was successful because he was very focused on his goal and constantly moving toward it,'' Murray says. Bob Drum, formerly Norstar DP chief and now co-chief operating officer with Sheppard, says Zucchini has ``a way of getting to the nub of things,'' and a knack for generating agreement on objectives. ``I think Mike is probably one of the best communicators I've ever encountered in 27 years in this business _ in listening as well as talking,'' Drum says. 
``I consider myself to be a pretty good manager, and I've learned a lot of things from Mike.'' Zucchini says the merger has given him more sensitivity to the need ``to work through people and not around them.'' Given the potential setbacks employees face in a restructuring, a successful leader needs to get them to see the value to the organization of doing things right, he says. Zucchini still faces 35 to 50 system conversions, but he is looking forward to spending more time with his wife and daughter on the horse farm they are developing in Chepachet, R.I., which is 30 minutes from work, and at his townhouse in Providence. He should not worry about having too much idle time. Soon, Fleet/Norstar expects to complete the acquisition of the $1.9 billion Indian Head Banks, Inc. in Nashua, N.H. ``We're beginning to look at it to see what path we should take,'' Zucchini says. By David Ludlum, CW staff <<<>>> Title : A different channel to tr Author : CW Staff Source : CW Comm FileName: videos Date : Jan 16, 1989 Text: MIS, the harbinger of technology for corporate America, is finding itself relying more and more heavily on high-tech solutions to train and educate its devotees. Interactive video, teleconferencing and pay-per-view seminars are practical, usually save money and time and can actually increase productivity, according to companies using them. One of the most visible techniques available for MIS learning is The Computer Channel, Inc., a 4-month-old pay-per-view series of seminars on topics such as networking, databases and OS/2. The program is broadcast via satellite and is geared toward the middle levels of MIS organizations. Each program typically lasts four hours and eliminates what in some cases would be two days spent traveling. 
A decided advantage of the live television broadcast is its question-and-answer sessions, during which the audience can participate in the conference and share opinions or questions with the speakers over the telephone, subscribers to the show said. At the Marriott Corp. in Bethesda, Md., many methods of video training are used, including videotapes and The Computer Channel. Both are cost-effective, said Larry Bryan, technology advancement coordinator for the hotel chain. ``We find that video training is much more cost-effective'' and more consistent than training people who will in turn have to travel to other sites to train others, he said. Furthermore, there can be even more cost savings through satellite conferencing because there are no production costs like those associated with creating in-house videos, Bryan said. ``We're counting more on the compelling information to be the focus of the broadcast, not the production that you have to have on videotape to maintain audience appeal and interest,'' he said. Some companies are using a combination of many technologies, including television. The Memphis, Tenn., headquarters of Federal Express Corp. is also the headquarters of FXTV, an in-house television network that is used at 700 Federal Express sites throughout North America. Dave Flanagin, senior manager of video systems development at Federal Express, said the company's FXTV system was such a hit when it was implemented a year ago that it outstripped its projected schedule and is being used to broadcast programs on a daily basis. While FXTV has the ability to be used for MIS training, Flanagin said the MIS department is more likely to use the available interactive video training. Flanagin said that FXTV has essentially replaced both a printed publication and videotapes within the corporation: ``We used to distribute something like 500 videos a year. 
Now we do almost all of that consolidated over FXTV.'' Flanagin said FXTV, which has been in place for a year, is expected to justify its cost within five years of its inception. Using a similar approach is J. C. Penney Co., at which the latest in video technology is not limited to training and MIS. There, the buyers for the retail stores simply watch the company's TV network to select product lines. Another alternative is interactive videoconferencing capabilities, such as those in place at Harris Corp. in Melbourne, Fla. The company uses this capability to hold nationwide product releases from its home base to hotels across the country and to hold meetings with sales representatives or other Harris officials throughout the country or the world. The Harris system continues to grow as a viable method to train people and trim travel costs, said Donald Wilkof, manager of corporate marketing support at the Customer Briefing Center. One time, when Harris Chief Executive Officer John T. Hartley was scheduled to speak at a meeting of Harris' European sales representatives, he ran into a scheduling problem, Wilkof said. So he walked the short distance from his office to the teleconferencing room, where he spoke with the sales representatives via satellite. ``Within an hour and a half, he was back in his office rather than going to London'' and spending two days traveling, Wilkof said. ``It's very cost-effective.'' As the old saying goes, though, you can't please all of the people all of the time. Some people really enjoy traveling, Wilkof said, but there may be a way to resolve that problem. ``We were even thinking of offering 1,000 frequent-flyer miles'' for people who agreed to conduct teleconferences rather than travel, he said. By Alan J. Ryan, CW staff <<<>>> Title : Bankruptcy rewrite gives u Author : Lee Gruenfeld Source : CW Comm FileName: gruen Date : Jan 16, 1989 Text: At last there is some potential good news for users of licensed software. 
It appears that Congress has finally gotten around to acknowledging that there may be a difference between an application package and a novel. In October, Senate bill S. 1626 amended the Bankruptcy Code in a way that may offer better protection to licensees or customers whose vendors become bankrupt. The original problem, which extends well beyond the concerns of software users, is complex but can be reduced to its essentials with a bit of oversimplification. It is common practice for software vendors to withhold source code from their customers. Generally, if the customer does not intend to modify the programs and instead relies on the vendor for upgrades, this is an acceptable arrangement as long as the vendor remains financially viable. But what does the customer do if the vendor becomes unable to continue support of the product? The traditional mechanism is to set up a third-party escrow arrangement for the source code. Contract provisions specify those conditions under which the vendor will turn over the source code. Examples include failure to issue timely upgrades or inability to correct bugs. Basically, the escrow agreement, like the entire license agreement, is considered to be an ``executory contract.'' But under bankruptcy law, the trustee of the bankrupt company can simply reject an executory contract _ and that is the end of the escrow agreement and the customer's rights to the source, no matter how diligently the customer may have complied with all the terms of the contract. No amount of contract language, no matter how clearly it expresses the intent of the vendor to turn over source code, can protect the customer. The legal journals are filled with clever strategies created by frustrated attorneys to try to protect their clients from this occurrence. Worse still, there have been no actual test cases to use as models. Furthermore, the very right to use the software itself is rarely dealt with. 
The camel's back The final straw prompting Congress to fix the problem was the case of Lubrizol Enterprises v. Richmond Metal Finishers in the 4th U.S. Circuit Court of Appeals. Richmond had licensed a proprietary metal coating process to Lubrizol on a nonexclusive basis. After Richmond went bankrupt, it decided that it could increase the price of its coating process if it took away Lubrizol's rights to it. Lubrizol had done nothing to violate its contract but, shockingly, the 4th Circuit allowed Richmond to reject the license agreement for no other reason than to increase the value of the technology to the bankrupt company's estate. The implications for software escrows were profound. Here was a case where the customer (Lubrizol) already had the technology in hand, paid for it and now was prevented from using it. The cavalry charges Enter Sens. Dennis DeConcini (D-Ariz.) and Howell Heflin (D-Ala.) and the Intellectual Property Bankruptcy Protection Act of 1988. The act is actually an amendment to the Bankruptcy Code and is designed to address those situations where there is ``an executory contract under which the debtor is a licensor of a right to intellectual property.'' Specifically, it is expected that software licenses will be considered executory for purposes of interpreting the code. Essentially, the new rules give the customer two options in dealing with rejected licenses. The customer can either consider the contract to be terminated or retain his rights under the contract. In both cases, the licensee may make a claim for damages caused by the rejection. The trustee of the bankrupt company will be prevented from interfering with the customer's right to obtain the source code (for example, from a third-party escrow holder) if that right was written into the software contract. Even better, the customer does not have to wait for the trustee to reject the contract but can request that the materials be turned over immediately. It is good news indeed. 
But, as usual, things are not as simple as they sound. The new rules will require very carefully worded contracts to ensure that the desired effect is manifested. None of this has any impact on other problems of escrow arrangements such as the need to ensure that the source code actually matches the object code _ a remarkably widespread and underestimated issue. The new rules open a great window of opportunity but are no cause for relaxation. By Lee Gruenfeld; Gruenfeld, a management consulting partner in the Los Angeles office of Touche Ross & Co., specializes in strategic information systems planning. <<<>>> Title : AT&T makes up its mind in Author : CW Staff Source : CW Comm FileName: gteisdn Date : Jan 16, 1989 Text: NEW YORK _ Determined to shed its aura of indecision and marketing ineptness, a seemingly revitalized AT&T has entered 1989 with a flurry of announced partnerships with, and planned purchases of, communications competitors. AT&T last week announced that it had formed a joint venture with GTE Corp. that reportedly will result in the sale of GTE Communications Systems' central office transmission equipment to AT&T. Two weeks ago, AT&T launched an aggressive bid to purchase Paradyne Corp., a once-successful data communications equipment manufacturer that came upon hard times after pleading guilty to charges that it conspired to defraud the U.S. government [CW, Dec. 26]. Two months ago, AT&T said it would purchase Harris Corp.'s Pactnet satellite ground station technology, which is used to provide satellite-based networks [CW, Nov. 28]. The acquisition included a line of very small-aperture terminal earth stations already marketed by AT&T. AT&T-IBM gateway Later this month, AT&T is expected to unveil an alliance with Cincom Systems, Inc. The agreement is intended to provide a gateway between AT&T's Unified Network Management Architecture and IBM's competing product, Netview. 
Last week's GTE announcement revolves around AG Communications Systems Corp., a joint venture first announced in July that was designed to bring Integrated Services Digital Network and other advanced technology features to GTE's digital telephone switching systems. A related agreement calls for AT&T to license certain 5ESS Switch and related technologies to AG. Financial terms of the deal were not disclosed, but the partners did reveal that GTE will initially hold a 51% interest in the firm, with AT&T eventually assuming 100% ownership after 15 years. Headquartered in Phoenix, the new company is composed of the network-switching business of GTE Communication Systems. All GTE Communication Systems' assets and its 4,900 employees have been absorbed into AG, including the Phoenix-based headquarters, principal research and development complex and manufacturing plants in Illinois. Jack M. Kirker, former president of GTE Communication Systems, heads up the jointly owned company as chief executive officer and reports to a board of directors representing GTE and AT&T. By Patricia Keefe; CW staff <<<>>> Title : Female executive bucks tr Author : CW Staff Source : CW Comm FileName: ricoh Date : Jan 16, 1989 Text: SANTA CLARA, Calif. _ Almost as rare in Japan as a woman with corporate power is a woman with a graduate degree in computer science. Hideko S. Kunii has both, and she is leading Ricoh Corp.'s effort to establish a software research center here. Commuting from Tokyo, where she created Ricoh's database management software laboratory, Kunii spends one week per month here. Such voluntary separation from a husband is nearly unheard of in Japan. ``My husband can take care of himself, unlike typical Japanese men,'' she says. Kunii holds out her hands and wiggles her fingers when describing just how few women have reached her rank in the Japanese business world. 
She says statistics from public corporations there show that only one out of 1,000 managers is a woman, ``and that includes department stores.'' Some Japanese social structures work against bringing women into the workplace. Most glaring, according to Kunii, is the discrepancy between men and women when they are first hired. ``There are two structures for women but not for men,'' she says. When women are hired, they are asked whether they are on a career path or a noncareer path. She says if they choose a career, they are told they may be separated from their families by being forced to move at the company's request. While Kunii is hardly traditional, she is swayed by the company's bottom line. ``Often, we train women who then quit to have a family. I can see two sides.'' Kunii, who studied for her Ph.D. at the University of Texas in Austin, is attempting to recruit U.S. engineers for Ricoh's development laboratory. Coming from a country where new hires only get one shot at a job and are placed there for the rest of their lives, she says she may have an easier time building staff here. ``In Japan, industry has a handicap,'' Kunii says. ``They have to educate after hiring.'' There are 100 times more undergraduate students in computer engineering here than in Japan and 20 times more graduate students. Often, she says, computer companies in Japan recruit liberal arts students for highly technical jobs. Another major difference is that in Japan, recruitment of employees from other Japanese companies is nearly taboo. With her partial U.S. schooling, Kunii may take a more global view than some of her Japanese colleagues. ``We're such a uniform country, but communication of information is so worldwide,'' she says. By J.A. Savage; CW staff <<<>>> Title : Gandalf pulls end-user st Author : CW Staff Source : CW Comm FileName: gand777 Date : Jan 16, 1989 Text: WHEELING, Ill. _ Gandalf Technologies, Inc. wants to get out from behind the corporate mainframe. 
For 18 years, the Canadian firm made a fine living by selling modems, multiplexers and other black boxes. Most of them typically resided in the far corner of the computer room, out of the way and largely unnoticed. They were usually reliable, making Gandalf a low-profile vendor in most large data processing shops. But last year, Gandalf made a break with its solidly successful past. It decided to market a new kind of product that would link personal computer clusters and local-area networks directly to the outside world, bypassing DP's glass house altogether. Gandalf calls the new product series, known as Starmaster, a hybrid network system. ``We're trying to avoid the mind-set that my box is faster than your box,'' said Howard J. Gunn, vice-president of corporate planning at Gandalf's U.S. marketing center in this Chicago suburb. ``We're trying to give the end users the techniques they need to run their business.'' Gandalf's bold product move comes at a time when its own business needs a boost. The company reported flat revenue of $29.4 million ($35 million Canadian) for its first quarter ended Oct. 29 and a $1.13 million ($1.35 million Canadian) loss in the same period. Although revenue surged to $137.3 million ($163.5 million Canadian) in the fiscal year ended July 31, profits were just $7.9 million ($9.4 million Canadian). Gandalf made an unfriendly bid for the UK's CASE Group plc last January, which the target firm rejected. Gandalf management has taken several steps to hold costs down. These include the November layoff of 90 U.S. workers and the shutdown of the assembly plant here. In its new product strategy, Gandalf has had to reach beyond its usual arena of communications hardware to provide application software, electronic mail and other soft features. Inside the Starmaster cabinet, customers can assemble a combination file server and network node in a fashion similar to the creation of a stereo system from major components. 
Testing, testing Gandalf claims several large customers for Starmaster, including General Motors Corp.'s Electronic Data Systems Division in Detroit, but many early users are still evaluating the technology, Gunn said. One reason is that Starmaster attempts to support a wide variety of computer architectures while eliminating the need for a remote IBM 3274 cluster controller. ``We don't want to argue the esoteric issues of different architectures,'' Gunn said. ``We're trying to bring performance improvements to all these environments. ``We're focusing on end users because we want to help them in planning their overall network design,'' Gunn added. But, he said, ``We're not keeping the MIS people out of the loop.'' The key to Gandalf's success in marketing its distributed file servers will be the degree to which they support open standards, according to Frank Dzubeck, president of Communications Network Architects, Inc. in Washington, D.C. Until now, many of Gandalf's most successful products have been weighted toward the IBM environment. ``They'll have to recognize the nuances of the marketplace,'' Dzubeck said, ``and right now, the drift in the marketplace is toward standards.'' By Jean S. Bozman, CW staff <<<>>> Title : In brief Author : CW Staff Source : CW Comm FileName: 19week Date : Jan 16, 1989 Text: One more for the new year Apple Computer, Inc. started the new year by corralling additional clout for developers with the acquisition of Coral Software Corp., a Cambridge, Mass.-based purveyor of programming languages and artificial intelligence tools for Macintosh personal computers. Coral currently makes and markets two LISP product lines _ Pear LISP, an entry-level development environment, and the more sophisticated Allegro Common LISP _ as well as Object Logo, an object-oriented version of a programming language widely used in education. 
Apple's acquisition includes not only the Coral technology but also five key software engineers around whom the new parent plans to develop a Cambridge-based research lab for its Advanced Technology Group. Is eight enough? Ask venture capitalist Thomas J. Perkins. Already the chairman of seven publicly held companies including Tandem Computers, Inc. and Genentech, Inc., the Kleiner Perkins Caufield & Byers partner heads into the new year as chairman _ the company's first _ of Alliant Computer Systems Corp., which hopes to harness his experience to renew its assault on the embattled minisupercomputer sector. Softech takes one giant step Softech, Inc.'s recent purchase of Wakefield, Mass.-based Compass, Inc. from Applied Data Research, Inc. marked the Waltham, Mass.-based company's entry into the high-end compiler and professional services marketplace for supercomputers as well as its first giant step beyond the federal government services arena into the commercial marketplace. The deal also effectively placed Softech in the rarified group of contrarians who have acquired organizations from super-acquisitor and ADR parent firm Computer Associates International, Inc. Oracle keeps on spiraling Oracle Corp. last week reported another quarter bound to disappoint those who speculated that the Belmont, Calif.-based database vendor could spiral no higher. Revenues for Oracle's second fiscal 1989 quarter ended Nov. 30, 1988 came in at $123.7 million _ a 105% increase over last year's second-quarter sales. Net income rose 125% to $17.2 million. <<<>>> Title : Hitachi beefs up U.S. pre Author : CW Staff Source : CW Comm FileName: norman1 Date : Jan 16, 1989 Text: NORMAN, Okla. _ If the oft-rumored sale of National Advanced Systems (NAS) to Hitachi Ltd. comes true, Hitachi will have a ready supply of U.S.-made storage peripherals available for the U.S. market. 
Hitachi said recently it is more than doubling the size of its storage subsystems manufacturing plant here by the end of 1989. Hitachi is taking advantage of the dollar's weak position against the Japanese yen to manufacture more economically in the U.S. The $10 million expansion of the year-old Hitachi Computer Products (America), Inc. facility is ramping up to produce 7380 JX and K direct-access storage device (DASD) subsystems for NAS mainframes. Previously, NAS obtained those newest DASD models from Hitachi's main plant in Japan, according to a spokesman. The lack of DASDs available for shipment was blamed in part for NAS' poor revenue in the quarter ending August 1988. Prior to the December ramp-up of JXs and Ks, the plant supplied NAS with DASD Model Ds and Es. By J.A. Savage, CW staff <<<>>> Title : Charting MIS course by de Author : CW Staff Source : CW Comm FileName: careers9 Date : Jan 16, 1989 Text: The graduate degree of choice for most MIS professionals is an MBA with a concentration in information systems. An alternative is the master of computer science, but this degree more often lends itself to technical positions in the vendor community. The new kid on the block is the master's degree in MIS, which is offered by only a dozen or so universities, far fewer than the ever-present MBA. Leaders of programs offering the master's in MIS say it was designed for people anticipating a relatively technical career track. In contrast, companies might seek MBA holders for positions in end-user departments in which they would know the work and could apply MIS expertise. However, Mike Lawson, associate professor of economics and chairman of the MIS department at Boston University, notes that graduates of the university's 6-year-old graduate MIS program have been promoted to some management roles. 
``Our graduates have already moved up to direct the MIS departments in small and medium-size organizations, and they have also advanced to high positions in consulting firms,'' Lawson says. MIS managers, academics and recruiters agree that the question of which degree to pursue is not based on which is better but rather on fitting the proper academic background to one's career path. Individuals with an MBA tend to land more lucrative posts than those with a master's in MIS. But Lawson says the choice of degrees should hinge on whether an individual wants the broad-based management outlook of the MBA or the more technical one of the master's in MIS. MBAs wanted Georgia Miller, assistant dean of the School of Business at Indiana University in Indianapolis, says that companies seeking to hire MIS employees with a broad business perspective for applications work or entry-level management positions favor MBAs. ``Companies will hire MBAs when they are looking for future leaders of the company, because they tend to have more of a generalist background,'' Miller says. Students joining the Indiana University MBA program on average have three to five years of experience in a general business function. ``We feel that it is very important for students to have business experience outside of their field,'' Miller says. ``MIS is not an area unto itself. The students have to relate it to the rest of the business.'' Warren Harkness, MIS director at Bose Corp., a manufacturer of audio products in Framingham, Mass., says his company hires MBAs with five to 10 years in a consulting environment for the in-house business systems consulting group. These MBAs try to identify solutions to business problems, then review the technical feasibility of the solutions and hand projects over to a technical person, Harkness says. 
Jim Young, managing director of recruitment firm Source EDP in Philadelphia, says his clients seeking employees with MBAs are generally management and computer consulting divisions of Big Eight accounting firms. ``The ideal academic background [for these firms] is an undergraduate degree in computer science and then an MBA,'' he says. Amy Peal, an analyst/programmer at Pennwalt Corp. in Philadelphia, began studying for her MBA in 1984. She says accounting and finance classes provide her with an understanding of end users' problems when designing systems in those areas. ``As an MBA candidate, I feel I have broader-based business knowledge that makes me more useful to my employer already,'' Peal says. Few grad MIS programs Boston University's Lawson estimates that there are currently fewer than a dozen graduate-level MIS programs in the country. ``The career track in MIS is still developing,'' Lawson says. ``Most universities are still trying to reconfigure their business programs to include MIS courses.'' Boston University takes the novel approach of requiring candidates for master's in MIS to attend the program full-time for 11 months. The university also requires candidates to hold an undergraduate or graduate degree in business. ``It's important for our students to have a grounding in business,'' Lawson says. ``Then when they enter the program, they take a core of business courses that focus on how MIS functions in those areas, along with specialized technical courses in MIS and management classes.'' The master's in MIS program at the University of Arizona in Tucson is more technically oriented than Boston University's and is designed so that students can opt for a part-time or full-time schedule. This degree is geared to positions in data administration, database design and networking, says Jay F. Nunamaker, director of the Arizona program. The greatest demand among organizations recruiting graduates is for telecommunications, Nunamaker adds. 
In the Arizona program, students take a core of five business courses along with technical MIS courses. They can specialize in several areas, including analysis and design of databases and communications and networking, according to Nunamaker. Lawson says he expects programs offering the master's in MIS to eventually spread to more campuses. ``There's a large demand in MIS departments for people trained in the technical areas of the industry,'' he says. By Janet Mason; Mason is a Philadelphia-based free-lance journalist. <<<>>> Title : Vaccines are no miracle c Author : CW Staff Source : CW Comm FileName: market09 Date : Jan 16, 1989 Text: In the aftermath of November's Internet virus attack, vaccines claiming to detect and cure various types of computer infections have proliferated. One antiviral package vendor tells of a fellow who travels in a mobile home, going on-site to hunt down viruses and administer inoculations in the style of Hollywood's Ghostbusters. As helpful as the vaccines may appear, some industry watchdogs and organizations are condemning some of the products as ineffective and potentially harmful. Del Jones, managing director of the National LAN Laboratory in Reston, Va., has issued a set of guidelines on virus prevention and control endorsed by about 70 manufacturers [CW, Dec. 12]. ``The guidelines specifically exclude vaccines because of the false sense of security they provide,'' Jones explains. ``Once you throw in a vaccine, there's a tendency not to look at management issues, which is the proper way to control the problem.'' Harold Highland, founder and editor of Computers & Security, the professional journal of the International Federation for Information Processing, refers to the newest vaccines as ``kindergarten variety.'' Highland recently commissioned a study of 20 commercially available antivirus products. 
The results, scheduled to be published early this year, make clear to Highland that the vaccine marketplace ``has a very long way to go.'' Temperamental vaccines The study's investigators encountered temperamental products that crashed systems and inexplicably worked only with particular text editors. Highland says he has seen vaccines that ``are effective against only one or two types of viruses _ though they don't tell you that.'' Highland says some products are apparently marketed without being tested on the viruses they claim to protect against, since captured viruses must be carefully controlled and are not readily distributed for commercial testing. Highland declined to provide more specific findings of the research. Martin King, manager of audit services at Computer Associates International, Inc., agrees with the characterization of some vaccine vendors as carpetbaggers but contrasts their work with the development effort he says went into CA's CA-Examine. Beginning in 1984, King, who has been lecturing on security issues for the past eight years, led a team that developed the antiviral package for IBM MVS systems, which is now in place at about 400 companies worldwide. The software, developed in cooperation with U.S. government agencies and Big Eight accounting firms, took two years to gel, contains 150,000 lines of assembler code and wields more than 100 on-line analysis displays, King says. ``It's serious software,'' he maintains. One vendor that takes pains to acknowledge limitations of its vaccine is HJC Software, Inc., which makes a package called Virex for the Apple Computer, Inc. Macintosh environment. ``Right on the packaging, in the documentation and even when the program comes up, we say, `Your computer is free of known viruses,' '' says Robert Capon, president of the Durham, N.C.-based firm. 
``We have a whole section in our documentation in which we emphasize that the use of products like Virex is not an alternative to adopting prudent computing practices, such as obtaining software from reliable sources and backing up,'' Capon adds. He maintains that Virex was in development for nine months and that the firm had planned a Nov. 16 introduction six months prior. ``I was as surprised as everybody when the Arpanet virus hit. The timing was purely a coincidence,'' Capon says, admitting that the outbreak did give his marketing effort a significant boost. Bard White, MIS director at Spaulding Sports Worldwide in Chicopee, Mass., chooses to rely on a vaccine as one segment of his antiviral armor. A year ago, he installed Systemate, Inc.'s Systemate vaccine on 150 IBM Personal Computers and compatibles as part of a security build-up. ``It's like using a wood stove in the winter,'' White says of the vaccine. It provides partial protection and some psychological comfort, he says, ``but at least you're trying to be proactive.'' King echoes this attitude on the vaccine. ``Though there's no perfect silver bullet,'' he says, ``I'd rather have five or six lead ones and do the best I can rather than do nothing at all.'' By Richard Pastore; Pastore is a Computerworld copy editor. <<<>>> Title : Hacker prosecution Author : CW Staff Source : CW Comm FileName: hacker1 Date : Jan 16, 1989 Text: LOS ANGELES _ Deemed a potential threat to society, a suspected hacker has spent more than a month in jail awaiting a late February trial for allegedly defrauding Digital Equipment Corp. and MCI Communications Corp. and transporting proprietary software across state lines. Kevin David Mitnick, 25, from Panorama City, Calif., could spend up to 30 years in jail and face a $750,000 fine if convicted. The U.S. Attorney's Office said it expects to file a new indictment superseding the above three felonies. Mitnick pleaded not guilty to all three charges Tuesday. 
``He's being used as a scapegoat for a lot of hacking,'' his attorney, Alan Rubin, claimed last week. Public enemy Mitnick is being held in a high-security area of Los Angeles' Metro Detention Center. ``The court ruled him a risk and a danger to the public,'' said prosecutor Leon Weidman. U.S. District Court Judge Mariana Pfaelzer refused to allow Mitnick access to telephones, as other inmates have, because the U.S. Attorney's Office is afraid that he may have preprogrammed a computer to trigger programs with a phone call, according to Weidman. Rubin attributed that restriction to prosecutorial paranoia, saying, ``They're frightened of people who understand computers.'' In affidavits filed in the case, the Federal Bureau of Investigation is reported to have found an associate of Mitnick's _ Leonard Mitchell Di Cicco, an employee at Voluntary Plan Assistance (VPA) in Calabasas, Calif. _ who said that Mitnick was illegally accessing DEC systems from a VPA terminal. Telephone charges were ``routinely changed by Mitnick from VPA's MCI telephone account to some unknown or nonexistent account.'' After Mitnick was arrested Dec. 9, FBI agent Christopher Headrick, with the help of Di Cicco, found an unauthorized copy of DEC's Security Software System stored in November in a computer at the University of Southern California to which Mitnick apparently had access. ``Lots of people have access to USC computers,'' Rubin noted. DEC Manager of Investigative Services Chuck Bushey, in part of the affidavit, said that four unauthorized intrusions into DEC's computer system resulted in access to the security software. The intrusions have cost DEC $4 million, according to the affidavit. DEC would not comment on the case. Mitnick was convicted in 1981 and 1983 of stealing technical information from Pacific Bell and accessing USC computers. By J.A. Savage, CW staff <<<>>> Title : U.S. 
airports to install Author : CW Staff Source : CW Comm FileName: airport Date : Jan 16, 1989 Text: WASHINGTON, D.C. _ In an effort to beef up security, U.S. airports will be required to install computer-controlled card systems that use magnetic identification cards to control employee access to sensitive areas, according to a new regulation from the Federal Aviation Administration. The announcement last week was in response to a December 1987 plane crash in California that killed 43 persons; it apparently was caused when a former airline employee opened fire with a gun during the flight. Investigators suspect the worker used his old airport identification card to board the plane. Computer-controlled card systems can be programmed to keep a record of employees who try to enter areas in which they are not authorized. ``They also can reject cards that have been reported lost or stolen or that have not been surrendered by former employees,'' said Secretary of Transportation James H. Burnley IV. Computer-card or equivalent systems must be installed at the nation's busiest airports by early 1991, according to the FAA regulation. Information is encoded on a magnetic strip or chip in the cards and then scanned by a card reader at entry points. About a dozen airports already have effective computer-card systems, the FAA said, but other airport operators complained that the FAA requirement is too hasty and too expensive, especially for smaller airports. The FAA disagreed, saying the average initial cost per airport will range from $56,000 to $1.5 million, depending on the size of the airport. By Mitch Betts, CW staff <<<>>> Title : Defense project draws fir Author : CW Staff Source : CW Comm FileName: norad Date : Jan 16, 1989 Text: They're not likely to make a movie about this. 
Five years after the film War Games had audiences on the edge of their seats over the deadly escapades of a computer gone haywire at the North American Aerospace Defense Command (Norad), a government report has concluded that Norad's project to build a replacement computer system is over budget, behind schedule and does not meet critical requirements. Norad, the military's attack-warning center deep within Colorado's Cheyenne Mountain, processes data from a variety of sensors to warn U.S. and Canadian leaders of airborne attacks. The center is in the middle of a $281 million program to replace its aging computer communications system. Trouble, trouble, trouble However, the U.S. General Accounting Office (GAO) reported last month that the project has run into big problems. Software for the replacement system is unstable and prone to failures, and restarting the system after a power loss takes an unacceptable 31 minutes, the GAO said. In addition, the auditors said, the new equipment is incompatible with other Cheyenne Mountain equipment because it uses a different wiring standard; installing new cables will be difficult because of a ``rat's nest of cabling under the floor.'' The U.S. Air Force said it is addressing all of these problems. According to the Air Force response to the GAO report, the contractor must fix the software at no cost to the government and meet the restart requirement of two minutes for critical circuits and 26 minutes for others. Norad computers achieved fame in the 1983 hit movie War Games, in which a teenage hacker tapped into a hulking Norad supercomputer and started it playing a game of global thermonuclear war. A plot twist gave the computer control of the Norad early warning system, placing U.S. missiles on alert. The GAO audit concerned the communications subsystem, whose processors control virtually all the digital communications at Cheyenne Mountain. 
The Air Force has wanted to replace the subsystem since 1981, mostly because it uses old Data General Corp. Nova 840 minicomputers that are hard to maintain. In 1984, the Air Force awarded the replacement contract to GTE Corp. in Stamford, Conn., but the installation date has slipped from 1986 to 1990, and the price has grown from $202 million to $281 million, the GAO said. DG no longer makes spare parts for the Nova 840 _ a 16-bit mini introduced in 1973 _ so Norad personnel have had to make their own parts or salvage them from retired Novas, the audit said. The replacements are expected to come from Stratus Computer, Inc. Due to delays in the replacement program, the Air Force began a $14 million program of interim upgrades to prolong the old system's useful life, including the replacement of its Honeywell, Inc. 60/60 mainframe with a Honeywell DPS 6000. The GAO said that because these and other interim measures have improved system performance, much of the bogged-down replacement program may be unnecessary. But the Air Force disagreed, saying a complete replacement is needed because the existing software, written in assembler-level languages, is too costly to maintain. By Mitch Betts, CW staff <<<>>> Title : Cincom to unwrap Supra up Author : CW Staff Source : CW Comm FileName: cincom89 Date : Jan 16, 1989 Text: Cincom Systems, Inc. is expected to unveil by March a completely rewritten version of Supra, its relational database management system, industry analysts briefed by the Cincinnati software firm said last week. The announcement of Supra Version 2.0, as analysts called it, may come as soon as the first week in February at a database conference on the West Coast, where Cincom officials are scheduled to participate. Cincom executives would neither confirm nor deny the reports. Some users questioned last week said Cincom told them that Supra 2.0 will run much faster than Supra 1.0. 
``Supra 2.0 will run above the 16M-byte line under IBM's MVS/XA,'' one Midwestern user said. ``That's a major performance improvement.'' This user also expects interfaces to major IBM subsystems, including IBM's VSAM, DB2, DL/1 and IMS. Most users have signed nondisclosure agreements and declined to provide details about the announcement. Still, some said they are concerned that Supra 2.0 may be marketed as a new product rather than as a less costly upgrade from 1.0. Tough conversion? There are also conversion worries. Some users were told this fall that Supra 2.0 might not support Cincom's Total, an older DBMS. ``Supra 1.3 gives you a bridge from Total to Supra,'' one user said. ``Supra 2.0 does not have that bridge, so that the Total user is left hanging.'' Still, there will be a migration path from Supra 1.0 to Supra 2.0, according to the Midwestern user. ``Conversion tools will be provided, because the new version has a different file structure,'' the user explained. ``But users should plan to migrate because, in time, the old version of Supra will go away.'' Primary features of Supra 2.0, analysts said, include the following:
- Support for on-line transaction processing (OLTP) operating systems. New maintenance features will allow software changes to be made without interrupting the central CPU engine.
- Support for a wide variety of Unix computers and workstations. Initially, support will be provided for processors from Amdahl Corp., Apollo Computer, Inc. and a few others; the second tier of Unix systems, including Digital Equipment Corp.'s Ultrix operating system and Sun Microsystems, Inc. products, is expected later this year.
- Support for both ANSI SQL and the international SQL standard, which will provide compatibility with IBM's DB2 relational DBMS product.
By Jean S. 
Bozman, CW staff <<<>>> Title : Corrections Author : CW Staff Source : CW Comm FileName: correct1 Date : Jan 16, 1989 Text: The charts accompanying the Product Spotlight on IBM and compatible PC products [CW, Dec. 19] gave an incorrect phone number for Relisys. The correct number is 408-945-9000. Ross Systems was incorrectly reported as holding an approximate 25% share of the Digital Equipment Corp. VAX accounting software market [CW, Nov. 14]. According to market research firm Computer Intelligence, Ross' share of that market increased by 75% between June 1987 and April 1988. The Dec. 26/Jan. 2 Industry Insight column erroneously included Scientific Computer Systems in a list of ``Casualties of 1988.'' The San Diego-based maker of Cray Research, Inc.-compatible minisupers continued to grow in 1988 and completed an $8.8 million round of venture capital financing in the third quarter. <<<>>> Title : HP combats DEC with low-e Author : CW Staff Source : CW Comm FileName: hpintros Date : Jan 16, 1989 Text: SUNNYVALE, Calif. _ Apparently vying for attention with Digital Equipment Corp., Hewlett-Packard Co. today is scheduled to introduce a bevy of products and announce price cuts. Two high-end deskside personal computers based on Intel Corp.'s 80386 processors, an entry-level three-dimensional graphics workstation and a Zenith Data Systems Corp.-based laptop are scheduled for introduction today by HP, company officials confirmed last week. Additionally, the company plans to announce price cuts on three of its workstations, cutting the cost of the 6-month-old Model 360SRX nearly in half. It is no coincidence that HP's announcement coincides with DEC's desktop barrage this week (see story page 1), indicated John Logan, vice-president of the Aberdeen Group in Boston. ``HP is jumping up and down to be compared to Digital,'' he said. 
``The company is realigning its price/performance to look better than DEC.'' Logan said he also expects HP to release benchmarks against DEC this week that ``look intriguing.'' Target: DEC Hitting DEC directly, HP now offers a software package that converts VAX applications written in Fortran to be used on HP's Series 800 reduced instruction set computing (RISC) Unix-based machines. The high-end PCs are the upper extensions of HP's Vectra line. The RS/25 uses Intel's 25-MHz processor and its 82385 cache controller. The Model RS/20 uses a 20-MHz chip. Because of the cache controller, the computer will run about 40% faster than earlier models on memory-intensive applications, according to Marlin Miller, HP product manager. HP has carried an RS/20 since February 1988 at prices similar to the new models', which begin at about $7,500. While the older models had more main memory at the low end _ 2M bytes on the Model 150 vs. 1M byte on the new Model 150E _ the old models had no cache controller. HP is set to make announcements in several product categories: Laptops. Capitalizing on the popularity of Zenith's laptop, based on an Intel 80286 processor, HP said it is now retailing the Zenith machine under its logo with only slight additions in software. The laptop, called the Vectra LS/12, has either a 20M- or 40M-byte hard disk at prices slightly higher than Zenith's, $4,879 and $5,479. Workstations. Expanding its line of Motorola, Inc. 68030-based workstations, HP announced a low-end graphics workstation priced at $14,900. The Model 340SRX competes directly with Silicon Graphics, Inc.'s $15,000 Personal Iris, introduced in October. Price reductions are also set to be announced today on three of HP's Motorola processor-based and HP's own RISC architecture workstations. A 42% price reduction will be granted to the mid-range 68030-based Model 360SRX, knocking the price from $34,965 to $19,900. The high-end Model 370SRX is due to get a 17% reduction to $41,000. 
Both were introduced midyear 1988. VAX conversion software. A $10,000 conversion program for VAX applications written in Fortran to HP 9000 Series 800 multiuser RISC-based Unix computers is scheduled to be available today. The software tools alone can be obtained for $2,000, according to a spokesman. By J.A. Savage, CW staff <<<>>> Title : Feds to track suspect doc Author : Mitch Betts Source : CW Comm FileName: quacks Date : Jan 16, 1989 Text: WASHINGTON, D.C. _ The U.S. government is creating a nationwide database to help hospitals and medical licensing boards keep track of quacks. The U.S. Department of Health and Human Services awarded a $15.9 million contract to Unisys Corp. Dec. 30 for the computer system, which the agency said is ``intended to lessen the possibility that incompetent physicians and dentists may move their practices from state to state without detection.'' The National Practitioner Data Bank was authorized by the Health Care Quality Improvement Act of 1986 and is expected to be operational this summer. The database _ consisting of disciplinary actions, peer-review reports and malpractice claims _ will provide a screening tool for hospitals and other organizations that hire doctors. Members of Congress praised the department for implementing the law. ``In the past, incompetents have been able to slip through the cracks and inflict poor medical care on the American consumer,'' said Rep. Ron Wyden (D-Ore.). Access to the database will be limited to hospitals, group medical practices, state licensing boards and individual doctors. The public will not have direct access, due to privacy safeguards, but consumers are the ultimate beneficiaries of the system, said spokesman Frank Sis. 
MITCH BETTS <<<>>> Title : UK bank invests in AS/400 Author : CW Staff Source : CW Comm FileName: citi Date : Jan 16, 1989 Text: HFC Bank PLC in London is forking over $20 million to IBM and Citicorp Information Resources for a banking system based on 190 IBM Application System/400s and 1,500 IBM Personal System/2s. The joint sale by IBM and Citicorp was announced last week. By 1990, HFC said it expects to have implemented the AS/400-based network, which will run the Citicorp Comprehensive Banking System. While this is not the largest single sale of AS/400s, it is the largest sale of the mid-ranges intended for a single private network, an IBM spokesman said. The distributed system will replace an aging IBM mainframe system that ran banking software designed by HFC. According to a published statement, the bank will scrap its batch-oriented software and remove the mainframe to make way for the AS/400s. HFC conducted a two-year study of banking systems and tested the Citicorp banking software before signing the contract, the companies said. The system is part of HFC's strategy called Project 90, an aggressive plan to increase its customer base from the current 300,000 to one million accounts during the next several years. The goal is to have a PS/2 for nearly every bank employee who deals directly with customers. The PS/2s will be linked to the AS/400s, which will be located in the bank's 172 branch offices in the UK. The Citicorp software was recently tailored to take advantage of the AS/400 architecture, Citicorp said. The heart of the system is a database called Common File, from which all the banking applications run. Citicorp will set up a link for diagnostic purposes from its headquarters in Orlando, Fla., to HFC facilities in Windsor and Birmingham, England. 
<<<>>> Title : HP, DEC will try to tarni Author : CW Staff Source : CW Comm FileName: as4002 Date : Jan 16, 1989 Text: IBM's Application System/400 has been wildly successful, with users buying the machines as fast as they can be built. But opinions differ on whether the honeymoon will extend past midyear, when the mid-range offering will face its first taste of real competition from rivals Digital Equipment Corp. and Hewlett-Packard Co. According to a study done by ADM, Inc. in Cheshire, Conn., IBM sold 32,000 systems in 1988. Those sales translate into approximately $5 billion in revenue, with nearly half that amount derived from the European market. Worldwide AS/400 installations are projected to increase to 98,000 in 1990 and 125,000 in 1991. ADM predicts a good 1989 for the AS/400. The firm justified this prediction by citing the continued pent-up demand within the IBM System/36 and 38 marketplace, the refinement of U.S. sales channels and the completion of the 1988 sales cycle for organizations that began looking at the system during the latter half of last year. Not so flowery But not everyone sees such a rosy picture for the IBM system. John Logan, executive vice-president of the Aberdeen Group in Boston, claimed that by midyear, IBM will be forced to engage in a toe-to-toe battle with HP's Spectrum 9000 series and DEC's symmetric multiprocessing machines. According to Logan, such contests will result in a leveling off of AS/400 sales, largely because the machine has a relatively poor reputation for performance, specifically when called on to run two or more applications. ``When the industry moves toward comparative analysis based on application solutions and price/performance, the AS/400 will find itself on shakier ground,'' Logan said. However, ADM President David Andrews said he believes that once the pent-up demand falls off, power boosts to the AS/400 for both the larger and smaller models will keep sales moving. 
This year, Andrews said, IBM will add functionality to allow the AS/400 to serve as a control point for intelligent workstations; in 1990, IBM will bring artificial intelligence, voice and data integration and image processing to the platforms. Logan attributed the success of the AS/400 at least in part to timing. He said the industry is experiencing a turnover in mid-range computers that were purchased in 1984 and are now outdated in office functionality and not worth maintaining. The first replacements affect the old IBM System/34, 36 and 38 as well as the DOS/VSE marketplace. According to Andrews, at least half the DOS/VSE users will migrate to the AS/400 because rewriting an application on the AS/400 is less expensive than porting it to the MVS operating system. By Robert Moran, CW staff <<<>>> Title : DB2 information-sharing g Author : CW Staff Source : CW Comm FileName: db2users Date : Jan 16, 1989 Text: CHICAGO _ A nationwide DB2 users group was incorporated here late last month in an effort to share information about IBM's relational database management system that is being gathered separately by more than 25 DB2 users groups throughout North America. Called the International DB2 Users Group (IDUG), the group's first aim is to hold a conference here in August. The group is overseen by the Chicago firm of Smith, Bucklin & Associates, which already organizes the Guide International Corp. and Share IBM users group meetings. But organizers say IDUG will not supersede existing DB2 users groups _ or even DB2 user committees within Guide and Share. ``We're not going to try to submit DB2 requirements to IBM like Share and Guide do,'' said IDUG President Bill Backs, who is also director of information technology at Scott, Foresman & Co. here. 
``They're very good at submitting DB2 product requirements, and we don't want to dilute that role.'' Instead, IDUG intends to give DB2 users and software vendors a single organization through which they can share DB2-specific information. ``We're trying to form an organization that encompasses all the people connected with DB2,'' Backs said. ``We are by no means trying to preempt the regional groups _ but we would like to facilitate communications between them.'' DB2 users groups have sprung up independently in all parts of the country since IBM introduced DB2 in 1985. The largest of these groups meet in New York, Chicago, Los Angeles, San Francisco, St. Louis and Toronto, and most draw 150 to 300 attendees per meeting. All are welcome IDUG will be managed by and for users, but vendors of DB2-related products and utilities are invited to join. ``We want to acknowledge that vendors have a place in the user community,'' Backs said. ``However, we're establishing a separate membership category for vendor members and limiting vendors to two of eight seats on the IDUG board of directors.'' Two Chicago-area DB2 vendors, Platinum Technologies and Peat Marwick Advanced Technology, gave IDUG approximately $30,000 in start-up funds, Backs acknowledged. But software vendors, as well as users of IBM-compatible mainframes, have traditionally been unable to attend Guide and Share meetings, Backs said, because neither group could claim to own an IBM mainframe. Leaders of regional DB2 groups seemed surprised by the creation of IDUG but are interested in learning more about it. ``I'm taking a wait-and-see attitude,'' said Joyce Bischoff, chairman of the Delaware Valley DB2 Users Group in Philadelphia. ``I'm in favor of good communications at any time, but up to now I've thought that Guide and Share were the ideal places to exchange DB2-specific knowledge.'' Others were optimistic. 
``If IDUG gets off the ground, it'll be of great benefit to the user community,'' said Bruce Gallagher, president of the Greater New York DB2 Users Group. ``And if IBM sees that IDUG represents its DB2 customer base, there is even the possibility for leverage with IBM.'' The next few months should determine whether the concept of pulling together a single DB2 forum in the U.S. and Canada will work. IDUG plans to complete its mailing to 10,000 DB2 users this month. Fees from the summer conference would, in part, keep IDUG rolling into 1990. By Jean S. Bozman, CW staff <<<>>> Title : Smooth start Author : Nell Margolis Source : CW Comm FileName: stock010 Date : Jan 16, 1989 Text: Chalk it up to post-holiday mellowness: Good news moved tech stocks last week, while other news basically left them in place. Digital Equipment Corp., on the eve of eagerly awaited workstation announcements, broke 100; DEC closed on Thursday at 100 , up 1 points from the week's starting price of 98 . Oracle Corp. announced yet another quarter of upward-spiraling sales and profits and saw its stock rise 1 points to 20 . Apple Computer, Inc. picked up a technology research power base in Cambridge, Mass.; its stock picked up 2 points, closing on Thursday at 42 . On the other hand, in the wake of a reported net loss and diminished revenue for its second quarter, On-Line Software International, Inc. stock closed where it opened the week at 5. Continental Information Systems Corp. also held at 2 after announcing a planned employee reduction that will bring the firm's recent job-cut total to 300. Wyse Technology announced a 15% work force reduction and closed at 6 , down of a point. Alliant Computer Systems Corp. announced a work force reduction of almost 20% and closed at 4 , down of a point. 
NELL MARGOLIS <<<>>> Title : `Service' redefined Author : Laura O'Connell Source : CW Comm FileName: trends19 Date : Jan 16, 1989 Text: Customers are redefining the term ``service,'' and vendors have a tall order to fill. Four or five years ago, service referred almost exclusively to hardware maintenance. Now that hardware has become so reliable and service prices have dropped, users seek more than repairs. Contributing to changes in customers' view of service is the shifting landscape of the user environment. Use of multivendor systems, networking and remote services is rising. Technical competence among users is growing. And MIS departments are taking on new roles: as contractors to develop and implement user applications; as vendors to provide installation, training and support; and as utility companies to offer telephone and transmission services. The Ledgeway Group, Inc., a Lexington, Mass., market research firm specializing in the service industry, asked 45 MIS executives at Fortune 1,000 firms what they want from service providers. They found service is now a crucial factor in the product-buying decision. Quality of service and support lands in the top three factors influencing the executives' first-time and repeat purchases. Customers also want vendors to understand them _ not only their business practices but, more important, their technical environments. Vendors, they say, have proven uncooperative in multivendor settings. LAURA O'CONNELL [Charts: MIS' criteria for service vendors; Evaluating the relationship; Shopper's priorities shift between initial and repeat purchases. Source: The Ledgeway Group, Inc.] <<<>>> Title : Inside lines Author : CW Staff Source : CW Comm FileName: liner19 Date : Jan 16, 1989 Text: A Prime opportunity? 
Novell is expected sometime during the next four weeks to make a joint announcement with Prime, which is currently beset by a takeover campaign from MAI Basic Four. Sources close to both companies say to expect a Unix port of Netware. Last month, Novell cited Unix support as one of its goals for 1989, and beleaguered Prime is said to be looking for an entry into the PC LAN arena. Network consultant Frank Dzubeck suggests that Novell may wind up with a Unix port that is specific to the Primos operating system rather than a generic product that will sing under many AT&T Unix System V-based operating systems. Ask not for whom the Bell tolls. While DEC President Ken Olsen last fall was publicly denying published accounts of AT&T's 1985 courtship of DEC as either a merger partner or takeover target, AT&T apparently came a-courting again, according to sources within DEC. With DEC's stock price wallowing in the $85- to $90-per-share range, AT&T saw DEC as an attractive target once again. Sources say DEC, on the very day of AT&T's approach, initiated a 10 million-share buyback of its own stock to bolster the price and fend off the suitor. Instead of responding to the published reports of the 1988 rumors, Olsen attacked, blasting The Ultimate Entrepreneur, a book by Computerworld editors Glenn Rifkin and George Harrar. Don't look for DEC equipment with that speaker wire. Although Tandy has made the PCs that will debut at DEC's desktop announcement Tuesday, they will not be sold at the neighborhood Radio Shack. A Radio Shack spokeswoman said some subtleties exist that differentiate the DEC and Tandy machines. But for the most part, she said, the machines are ``Digital's in name only.'' Speaking of channels . . . Sales of the Tandy 5000, the company's Micro Channel Architecture-compatible PC, have exceeded expectations, said Ed Juge, director of marketing. 
Juge said that although Tandy is trying to keep up with back orders, sales are small in comparison with IBM PC AT-type system sales. But he noted that Tandy has ``a hell of an advantage'' over other clone vendors competing against IBM's locked-up distribution channels because the Tandy 5000 can be found at the neighborhood Radio Shack. Buyouts and takeovers and all that jazz. Computer industry investment banking, mergers and acquisitions firm Broadview Associates' first contribution to the Bush administration would be considered a fairly big deal by most standards: Partner Edward I. Metz will ply his talents as a jazz pianist at the president's inaugural ball later this month. $5 million and counting. This month, Microsoft will start spending money to promote its Excel spreadsheet for the PC, and it won't stop until it has blown $5 million. The effort is aimed at turning the heat up on Lotus, which has no graphical spreadsheet and _ still _ no 1-2-3 Release 3.0. Look for aggressive pricing, rebates, seminars and anything else that will help the product move out the door. Meanwhile, Microsoft plans to get a jump on the Presentation Manager market by shipping Excel/PM by the middle of the year. Look for a $50 upgrade charge. Ringing into MAP. Expect to see AT&T become a Manufacturing Automation Protocol player in the near future. AT&T R&D subsidiary Bell Laboratories is reportedly working on an interface that will run MAP's Token-Bus protocol on top of fiber-optic cable, which now has no official place in MAP. No spotlight-sharing this time. At DEC's lavish desktop rollout tomorrow, it will be K. O. solo. Recent DEC micro-related announcements have been held with Olsen introducing a micro partner CEO. Last year, it was Apple's John Sculley and Ashton-Tate's Ed Esber. Not this week; other CEOs will be as invisible as the fruits of the year-old Apple-DEC alliance. 
Anybody who has spotted a tangible DECapple product on a desk should phone it into the hot line (800-343-6474 or 508-879-0700) and get News Editor Pete Bartolik to roll out the news team. <<<>>> Title : A view from the top Author : CW Staff Source : CW Comm FileName: akerstap Date : Jan 16, 1989 Text: When John F. Akers became chief executive officer of IBM in 1985, the world's most profitable corporation saw a bright future ahead. The 3090 was ushering in a lucrative mainframe product cycle, the IBM Personal Computer was king of the desk top, and IBM had ambitious plans to battle AT&T for worldwide communications supremacy. Most ambitious of all, IBM said the heady goal of $100 billion in annual revenue was attainable by 1990. Within one year, however, Akers faced a much more humbling challenge _ how to restore year-to-year growth in profits. After IBM's first two consecutive years of earnings declines since the Great Depression, Akers achieved that goal in 1987 and will again in 1988 _ although profits will still be below their 1984 peak. Akers implemented some of the most sweeping internal changes in IBM's history, including a net work-force reduction, without layoffs, of about 16,000 employees since 1985. In an exclusive interview with Senior Editor Clinton Wilder and Editor in Chief Bill Laberis last month at IBM's Armonk, N.Y., corporate headquarters, Akers spoke candidly about IBM's overdue reorganization, his feelings about Japan's MIS philosophy and IBM's software and telecommunications strategies to serve distributed IS organizations. The cost cuts and redeployments of the past three years were necessitated by what would appear to be a bloated middle management structure. How did things get to that point, and what lessons has IBM learned? For 75 years _ certainly for the past 25 _ IBM has had a remarkable performance in business success. That success tends to have a management team not want to tamper with what's in place. 
As that success began to diminish a bit in the last three or four years, it's an awful lot easier to gain the attention of the management team and all the people. What had to happen for this enterprise is to have reality come in and hit us on the head. Reality has hit us on the head, and we say, ``What does it take to be competitive, to get growth going again, to have the return on investment that we want to have?'' But it takes that cold shower before you come to work in the morning to get your blood going. And our blood is going. In comparing the MIS market with what it was 10 to 15 years ago, is it fair to say customer loyalty isn't what it once was? Well, there are an awful lot more customers. When I grew up as a salesman with IBM in the '60s, we used to have one or two customers per enterprise. Today, there are thousands. There's a much broader constituency to try to satisfy, and there's a much broader array of competitors trying to do exactly the same thing. The race is getting swifter. At Computerworld, we see a less hierarchical distribution of power in information systems networks. How do you look at the MIS chief as this evolution is taking place? How is IBM relating to that individual or office? Well, I think that job is getting a whole lot harder. There are three or four pressures on that individual that are very tough. The top management of American industry is constantly asking whether or not they're getting their return on their investment in information systems. That office has to help answer that question, and we have to help as well. Secondly, because more people are involved in the enterprise, the job becomes not deciding what to do and doing it _ but deciding what to do and then trying to get everybody into that same boat. There's a lot of persuasion that has to go on, and that's very tough. The third major pressure, particularly in enlightened business enterprises, is that competition, for our customers, is as tough as it is for us. 
Information systems, appropriately applied and implemented, can be a competitive weapon. The role the IS manager should be playing is more central to the operations of the business. Absolutely. It wasn't so long ago that you had to be technically competent and deliver what you promised. Those days are gone. You have to be an advocate for and implementer of systems across enterprises that really impact the competitiveness of the enterprise. That's a big job. What challenge does that pose to IBM to serve that IS manager? How does IBM respond? In two major ways. First of all, we have created a sales and support organization around the world that increasingly understands our customers' businesses and business problems. We'd like to have the IBM team viewed as a plus in the discussion of the business problems. Technical competence, in and of itself, won't carry the day at all. The second thing is, we're spending more of our development money on solutions. There are a lot of questions about IBM being a hardware business or a mainframe business, but you'll find an increasing percentage of development resources going into solutions. The operation in the U.S., leading it on a worldwide basis under Ned Lautenbach, has very aggressive budget increases. What do you mean specifically by more development effort into solutions? Is it more vertically oriented applications? Is it mainframe-oriented? It means that if you were to walk into Ned Lautenbach's conference room, you would find him conducting meetings on what are the top opportunities for information systems, industry by industry. What are the ingredients necessary to provide that set of solutions to the marketplace? Some of them will be 370, some will be AS/400, some will be PS/2, and some will be all of the above. When SAA gets fleshed out a bit more, it will go across all three. When you make significant internal changes, as you have done, that can be painful. 
Do you feel that the company is over the worst of the adjustments that it needs to make internally? The vast majority of organizational and management changes are in place. You don't organizationally change a business as large as this one, as much as we have, without creating some disruption. There's been a feeling on the part of IBM people that we're on the right track _ and then schizophrenia about more change vs. ``let us absorb what's already in place.'' Were there fundamental shifts in demand or market conditions that caused IBM to reassess its approach? I don't think they were fundamental changes, other than the continuing changes toward easier-to-use, lower priced computers and a plethora of workstations and networking. But I think all of us in the industry understood that that was going on. What IBM needed to do better was to be closer and more in concert with our customers. We needed to have a better product line, and we needed to be more efficient. I meet with customers very often, all around the world, and all of them say there is a new and different IBM. That doesn't mean it's a perfect IBM; it's a new and different and improving IBM, and that's gratifying. We're a more humble IBM, with more of a desire to participate with our customers, have them come into our facilities, review our plans and programs. You have spoken out strongly in the past about IBM's need to protect its intellectual property. How do you feel, on a gut level, about the recent arbitration decision granting Fujitsu restricted access to IBM source code? Are you comfortable with it? I'm comfortable with the conclusion. What do you think is the proper role for the U.S. government in trade? What would you like to see more of from the incoming administration? The actions that have been taken by the Reagan administration have been terrific. I would say it should be a continuation of the same. In the final analysis, what the world needs is open markets and open trade. 
What are two or three technologies that you think will have the most impact on your largest customers, and Computerworld readers, in the next five to 10 years? I would say Systems Application Architecture is one. I'm not being facetious. If you ask the pros in computing what they'd like to do that they can't do today, what they need is productivity enhancements in software development, continuity in terms of people dealing with computers and screens, and enhanced connectivity. That is precisely what we're trying to give them. Beyond that, I would expect that some of the things going on in image processing [will have a large impact]. I went down to USAA Insurance in San Antonio to see what's going to happen there when they take the 25 million letters they get every year and eliminate all the paper; it's very exciting. And it's not just the people dealing with the paper that will be impacted, it's the management and executive management, who wax eloquent about the competitiveness of their enterprise. Perhaps expert systems will have a bigger impact, too. Does Wall Street adequately understand what it means to change a company as large as IBM in a rapidly changing industry? I think there is a tremendous desire for short-term performance in U.S. markets. The short-term performance of IBM in 1987 and 1988 has been satisfactory. I think what Wall Street would like to see is a continuation of what the trends have been. How do you deal with the conflict between Wall Street's desire for short-term results and a company's long-term plans, some of which may take years to bear fruit? Every company in the U.S. has that conflict. We manage the business for the long haul. We're not simply a fab-and-assembly hardware business that picks technology off the shelf. We invest in our semiconductor facilities, and those paybacks aren't for five to seven years. I think the software strategy that we're on, embodied by SAA, is long-term _ you can't do those things in a very short time. 
If we didn't have that strategy, we'd be sitting around in task forces trying to figure out how to put that strategy into place. It's in place now, and it's not all mañana _ you can touch and feel pieces of that software strategy that are in our customers' offices. So we try to balance the need for tactical performance improvement, but practically speaking, this business has to be managed for the long term _ and it is. If you think back four or five years before you became CEO, what turned out to be different from what you expected? It's been much more challenging than I would ever have thought. The requirements for IBM to change and the changes that are ongoing are probably as significant as any in the history of the enterprise. I think that in the early 1980s, for a time, the business enjoyed enhanced success over the prior five to eight years; that tended to suggest the business was in good shape and would continue to be so. And that clearly has not been the case. So I think that the cold light of day is upon us; we understand what we have to do, and we're working hard at it. If you could put IBM's 500 largest customers in one room, what would you want to tell them about the IBM of the 1990s? I would like to tell them that we really are working every day to try to have a set of delighted customers. The IBM of yesterday, which was a bit too reserved, played its cards too close to the vest, wasn't as approachable as people would have liked, is really an IBM of yesterday. I would like to have an IBM emerge over the next several years that is sincerely approachable, that is responsive, that is open, that has a sense of humor, that doesn't take itself as seriously as perhaps it did in the past, that really is an enterprise dedicated to adding value to a customer's operation. That's what I'd like to tell them: Keep watching this space. 
By Clinton Wilder and Bill Laberis, CW staff <<<>>> Title : Akers on Japan Author : CW Staff Source : CW Comm FileName: side1 Date : Jan 16, 1989 Text: The Japanese competitive edge usually evokes images of mass-produced CD players or memory chips. But to IBM Chairman John Akers, the edge also exists in the strategic use of computer technology. ``There's no question in the minds of Japanese management about the necessity to have effective, efficient, aggressive IS strategies,'' Akers said. ``There is less commitment in the U.S., and I think that's a problem. ``That's another pressure on the IS manager, because there's an educational requirement that's necessitated by that lack of commitment,'' Akers added. ``Non-American managements are more tuned up on this subject.'' <<<>>> Title : The telecom front Author : CW Staff Source : CW Comm FileName: side2 Date : Jan 16, 1989 Text: To most observers, IBM's telecommunications strategy for the past four years has been marked by fits and starts, with various investments in telecom providers being sold off or scaled back. Not so, says John Akers. He insisted that IBM's former 16% stake in MCI Communications Corp. was primarily a financial investment and that its recent sale of 50% of Rolm Corp. to Siemens AG does not signify less emphasis on the U.S. private branch exchange market. ``We continue to want to aggressively sell PBXs in the U.S., and I think we can,'' he said. ``Siemens brings relationships and a customer base in Europe, communications strengths that we don't have. We're putting this business together to use their strengths and ours; we're in it for the long term with them.'' <<<>>> Title : Honeywell Bull axes 1,600 Author : CW Staff Source : CW Comm FileName: 1bull Date : Jan 16, 1989 Text: BILLERICA, Mass. _ Honeywell Bull, Inc. announced Friday it will streamline its U.S. work force by some 1,600 _ 16% of the current complement. 
The move is an attempt to attain fighting trim for ``leaner, harder, faster competition'' in a turbulent computer industry, said Roland Pampel, Honeywell Bull president. While he declined to specify a figure, Pampel said that the reduction is expected to save the $2 billion international Bull partnership ``a lot of money: It will result in a fairly significant improvement in overall costs.'' The move may prove more effective than many other instances of work force reduction, including some at Honeywell Bull, predicted Donald Bellomy, an analyst at International Data Corp., a market research firm in Framingham, Mass. ``They're not just protecting the bottom line, although there is that aspect; they're also focusing on redundancies and inefficiencies that really do exist,'' Bellomy said. The partnership of Groupe Bull, Honeywell, Inc. and NEC Corp. necessarily left Honeywell Bull with some ``serious disjunctions'' that will be beneficially eliminated, Bellomy added. The employee cuts, Pampel said, in no way signify a change in Honeywell Bull strategy; rather, they are aimed at creating a company in condition to pursue strategies already in place. Manufacturing and administration will take the hardest hits; ``we're not touching sales,'' Pampel said. Details with regard to geographical impact will not be known until the company determines how many of the approximately 1,000 employees now eligible for early retirement will take advantage of what Pampel described as a ``fairly generous early retirement plan.'' ``So much of where it hits will depend on the outcome of the early retirement plan,'' Pampel said. Whatever the outcome, though, Honeywell Bull will continue to vigorously pursue research and development activities at both of its current major R&D locations here and in Phoenix, Pampel said. Of the employees eligible for early retirement, approximately 650 are located in the Boston area; 250 now work at the Phoenix plant. Charles P. 
White, program director of industry service at the Gartner Group, Inc. in Stamford, Conn., saw last week's action as ``the precursor of a general [Honeywell Bull] reorganization that I've been expecting for some time.'' White predicted that the company will further move to place its European operations under a single marketing group in preparation for the 1992 formalization of the European Economic Community trade alliance. Also, he said Bull might reduce headquarters staffing in pursuit of the distributed management, or decentralized decision making, that many market analysts regard as a hallmark of a survivor in the data processing arena. The decisive tone and content of last week's announcement, Bellomy said, also signaled Pampel's recognition that Honeywell Bull has an image problem to overcome before it can become the computer market contender it wishes to be. ``He's beginning to put his mark on the company,'' White said. As predicted, Honeywell Bull announced earlier last week that French co-owner Groupe Bull had acquired 22.6% of the shares formerly owned by Minneapolis-based Honeywell, Inc., leaving Bull as majority owner with 65.1%, Honeywell with 19.2% and Japan's NEC with 15%. Pampel was emphatic in stating that the latter layoff announcement had ``absolutely nothing to do with'' the long-planned ownership shift. ``As a matter of fact,'' he added, ``Bull itself is going through some of the same kind of streamlining.'' By Nell Margolis, CW staff <<<>>> Title : Alliant cuts one-fifth of Author : CW Staff Source : CW Comm FileName: calliant Date : Jan 16, 1989 Text: LITTLETON, Mass. _ Alliant Computer Systems Corp. stripped down for action in the increasingly competitive minisupercomputer market last week, paring the company of 75 employees _ almost 20% of its work force. Alliant said it expects the restructuring to decrease the company's annual costs by approximately $20 million. 
This, Alliant hopes, will thereby speed the company's return to the profitability that has eluded it for the past two quarters and is likely to prove missing when results are announced for the fourth quarter ended Dec. 31, according to President Ronald Gruner. It will also generate a ``material'' one-time fourth-quarter charge, Gruner said. ``The bare fact of the restructuring isn't likely to surprise many people. The magnitude of the write-off might,'' said Jeffry Canin, an analyst at Hambrecht & Quist, Inc. While Alliant declined to quantify the impending write-off, Canin speculated that it could be in the $10 million to $15 million range. Need to absorb Canin called the move ``more reactive than strategic'' on Alliant's part. An early entry in the minisuper market, Alliant has been faced with the need during the past several months to absorb recently acquired subsidiary Raster Technologies, Inc. while battling a barrage of fierce competition, said Theresa Liu, an analyst at Montgomery Securities in San Francisco. The purchase ``expanded Alliant's product line but perhaps diffused its corporate focus,'' Liu noted. Workstation announcements anticipated from Digital Equipment Corp. (see story page 1) are expected to further intensify the competitive pressure. Gruner, however, appeared sanguine with regard to the effect on Alliant of a DEC onslaught. ``Truly, the competition for us _ implicit now, explicit in six months _ is Digital,'' he said. ``There's a fair amount of pent-up business out there, waiting to see what Digital brings out.'' When they see, Gruner said, ``we think that a lot of them will decide on Alliant.'' By Nell Margolis, CW staff <<<>>> Title : MAI hints at sweeter offe Author : CW Staff Source : CW Comm FileName: lebow Date : Jan 16, 1989 Text: TUSTIN, Calif. _ MAI Basic Four, Inc. last week turned up the heat in its hostile bid for Prime Computer, Inc., indicating a willingness to sweeten its $20-per-share offer. 
Prime viewed the sugar as thinly coating an unpalatable proposition, and Prime users remained unruffled, even in the face of MAI's recent claim that more than 50% of Prime's shares have already been tendered. In a letter addressed to Prime Chairman David Dunn and Chief Executive Officer Anthony Craig, MAI Chairman Bennett LeBow touted the cost savings of a negotiated transaction and suggested that MAI ownership might be inevitable. Anything but, retorted Prime. In a letter to LeBow, Dunn called ``absolutely untrue'' MAI's contention that Prime's experience ``must lead you to believe that there is no serious alternative bidder for Prime'' and invited MAI to withdraw its bid. Citing a litany of allegations of wrongdoing against LeBow and partner William Weksel, as well as major financier/co-bidder Drexel Burnham Lambert, Inc.'s recent agreement ``to plead guilty to multiple felonies, several of which involved the fraudulent use of inside information,'' Dunn concluded that ``we are skeptical that Drexel, Weksel and yourself would treat Prime and its stockholders any better than you have treated your customers and/or stockholders in previous situations.'' Big eyes, big stomach Derided early on as all but incredible, the small California company's attempt to swallow up the far larger Massachusetts concern made a huge stride toward credibility with its year-end claim that holders of 24.2 million of Prime's shares have already accepted its offer. Added to the 1.9 million Prime shares already controlled by MAI and Drexel, this gives the would-be acquirer more than half of Prime's outstanding shares. Nevertheless, last week's letter did not appear to be written from a position of strength, according to Stephen Dube, an analyst at Shearson Lehman Hutton, Inc. 
The coy ``offer to make an offer'' nature of the correspondence, he said, cast doubt on MAI's confidence that it could amass the 85% of outstanding Prime stock mandated under applicable Delaware law to achieve a successful hostile acquisition. In addition, Dube said, Prime's recent reorganization, consisting of a realignment and consolidation of business units as well as an approximately 10% work-force reduction, is expected to save the company approximately $50 million in 1989, making Prime worth more than it was when MAI priced its original bid. Prime stock, which has been inching upward, closed Thursday at 18 . ``If Prime can get over $20 without opening their mouths, this offer could be moot,'' Dube noted. Prime users do not appear greatly disturbed by MAI's actions. ``I can't say I'm excited about the prospect,'' said Jim Tunis, president of Lincoln National Information Service, Inc., a wholly owned subsidiary of Fort Wayne, Ind.-based Lincoln National Corp., which owns about 18 Prime systems. Big enough? ``I was a little concerned when MAI first made its bid,'' said James Gaspers, MIS coordinator for the city of Scotts Bluff, Neb. ``I still am. I'm not sure that a company the size of MAI can give us the level of support we're used to, and I'm not sure they'll have the wherewithal to keep the company alive.'' Sharing the latter concern is Massachusetts Federal District Judge A. David Mazzone. Late last year, Mazzone enjoined MAI's tender offer pending further disclosure with regard to the deal's financial framework. This decision was made particularly in light of Drexel Burnham's recent decision to plead guilty to six felony counts of mail, wire and securities fraud and a decision by the Delaware Chancery Court that upheld Prime's ``poison-pill'' provisions. By Nell Margolis, CW staff <<<>>> Title : DEC bids for desk top _ a Author : CW Staff Source : CW Comm FileName: 1deckedo Date : Jan 16, 1989 Text: MAYNARD, Mass. 
_ In terms of reaching the desktop market, it's crunch time for DEC. Digital Equipment Corp. will begin at least its third attempt to win the hearts and dollars of desktop users tomorrow when it unleashes an array of hardware and software products designed to quickly and cleanly make it a dominant player in the flourishing desktop market. The major hardware components of what DEC officials called the largest introduction in the firm's 31-year history _ the extravaganza will fill a building in nearby Littleton, Mass., that has been dubbed Dectop University for the occasion _ will range from personal computers to workstations capable of processing about 10 million instructions per second. In one fell swoop, DEC hopes to blunt the expansion of competitors like Sun Microsystems, Inc. that have carved out significant market share. Additionally, DEC faces the slippery marketing task of selling to users who still remember its unsuccessful 1982 desktop rollout, at which the company introduced three incompatible systems: Decmate, Rainbow and Professional. The 1985 Vaxmate introduction was also nothing short of a sales disaster. The machine that analysts say is most likely to put sweat on DEC's competitors' brows is code-named PMAX. The machine is a high-performance technical workstation based on Mips Computer Systems, Inc.'s R2000 reduced instruction set computing microprocessor, making it DEC's first significant product since Rainbow to use a non-DEC CPU. The PMAX is expected to process 10 ``VAX units of performance,'' or VUP (a VUP equals approximately one MIPS), making it a challenger to Sun's Sun-4/260 workstation. The machine could also significantly undercut the Sun-4/260, which has a base price of nearly $40,000. Terry Shannon, director of Framingham, Mass.-based International Data Corp.'s (IDC) DEC Advisory Service, said, ``It would not surprise me to see DEC offer RISC MIPS at $1,000 apiece.'' 
Halt the exodus Steve Blank, a cofounder of Mips Computer Systems who now heads the marketing department at Ardent Computer Corp., said DEC's inclusion of the R2000 could bite into the market share of workstation makers such as Prime Computer, Inc. and Silicon Graphics, Inc. Blank said these vendors have been ``fighting over the dead bodies of Microvaxes and VAXs. DEC customers could put up with DEC's lousy performance for only so long, then they looked elsewhere. I think this could stop that sort of attrition.'' The PMAX will run only DEC's Ultrix operating system, rather than its VMS. Another key hardware cog will be a relabeled Tandy Corp. personal computer that is likely to use Microsoft Corp.'s MS-DOS operating system and Intel Corp.'s 80286 and 80386 microprocessors. The 80386-based model will also support OS/2. The release of this PC will also signal the eventual phasing out of the Vaxmate and Vaxstation 2000, a DEC official indicated. ``Over time, both will be replaced by the new products,'' said Pete Smith, vice-president of DEC product marketing, at a recent briefing. Some analysts predicted the phaseout could begin this fall. Heading the list of internally developed products will be the PVAX, a low-end commercial workstation based on DEC's CVAX processor that will run approximately 2.4 VUPs. Offered in both VMS and Ultrix versions, the machine will be the first DEC workstation to incorporate 3½-in. peripherals and is expected to compete with the Sun-3/60 workstation. Also arriving will be the Firefox machine, DEC's second-generation mid-range workstation. The two-processor Vaxstation 3520 and four-processor Vaxstation 3540 will offer approximately 5 and 10 VUPs, respectively. While it is expected to be a VMS-only machine at announcement time, Ultrix symmetrical multiprocessing support is likely to be forthcoming. A diskless model is also expected. 
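The VUP ratings quoted for these machines translate directly into rough MIPS figures under the article's rule of thumb that one VUP is approximately one MIPS. A minimal sketch of that arithmetic (machine names and ratings are taken from this story; the conversion factor is the cited approximation, not a benchmark result):

```python
# Back-of-envelope conversion of DEC's "VAX units of performance" (VUP)
# ratings to approximate MIPS, using the rule of thumb quoted in the
# article that one VUP equals roughly one MIPS. Illustrative only.

VUPS_PER_MIPS = 1.0  # approximate equivalence cited in the story

def vups_to_mips(vups: float) -> float:
    """Approximate MIPS rating for a machine rated in VUPs."""
    return vups / VUPS_PER_MIPS

# Ratings quoted in the story
ratings = {
    "PMAX": 10.0,
    "PVAX": 2.4,
    "Vaxstation 3520": 5.0,
    "Vaxstation 3540": 10.0,
}

for machine, vups in ratings.items():
    print(f"{machine}: {vups} VUPs ~= {vups_to_mips(vups):.1f} MIPS")
```

On those numbers, the 10-VUP PMAX lands in the same performance class as the roughly 10-MIPS workstations cited at the top of the story.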
DEC will also promote a series of software products intended to tie into an announced connectivity scheme called Applications Integration Architecture, an umbrella name for adjuncts to heterogeneous systems, and Decwindows (see story above). All the products are scheduled to be available within three months and are predicted to favorably impact the third and fourth quarters of DEC's fiscal year. Although the number of product introductions could take users aback, some analysts said they think the scope is appropriate. ``Once a customer goes to another supplier to buy something, you never know what their next move will be,'' said Jay Stevens, an analyst at Dean Witter Reynolds, an investment firm in New York. ``They should have done [the announcement] years ago.'' Another burning question DEC must face is whether the products come too late to sway firmly entrenched workstation users. Both Sun and Apollo already enjoy significant market share and customer loyalty. ``I think DEC is wielding a sharp sword here, but it better be careful it doesn't cut off its own foot,'' said Bob Hambrecht, a senior technology analyst at Hambrecht & Quist. ``You have to be very careful when you start competing with someone by copying him.'' But even entrenched products can be dug up. ``If they're good products, we don't care how late they are getting to market,'' said Dean Allen, vice-president for information services at Lockheed Corp. in Calabasas, Calif. ``Right now, we've got all types of workstations, but if the new DEC machines give us the kind of price/performance we're looking for and we can put it into our environment transparently, we're going to do that.'' A further curve could be thrown DEC's way by Sun and IBM. 
Sun is due out with a low-end system, based on its Scalable Processor Architecture, that will clock in at about 7 MIPS and could cost less than $10,000, said Vicki Brown, director of systems research at IDC, while rumors have been circulating that IBM will release a high-performance version of its Personal Computer RT workstation. Senior Editor Rosemary Hamilton contributed to this report. By James Daly, CW staff <<<>>> Title : DEC's game plan Author : Stanley Gibson Source : CW Comm FileName: decsoft2 Date : Jan 16, 1989 Text: In a briefing last week, DEC officials offered some insights into various software strategies, including the following:
The announcement tomorrow includes an MS-DOS compatibility software package for VMS intended mainly for the Vaxstation 3100 (PVAX). The package will allow MS-DOS to run under VMS in multitasking mode.
DEC will ``program-announce'' All-In-1 with Decwindows. That is, it will not announce pricing and availability right away but will say that work is being done to build an All-In-1 version with a Decwindows interface.
DEC is changing its attitude about Unix. The new approach is to ``just say yes. If people ask for Unix, we say yes,'' said Don McInnis, DEC vice-president. Previously, DEC pushed VMS.
Symmetrical multiprocessing will not be announced with Ultrix next week. How soon DEC will announce symmetrical multiprocessing with Ultrix ``depends somewhat on OSF developments,'' McInnis said, although he claimed DEC's strategy is not tied to what the OSF does.
Over time, DEC will bring Decwindows to an IBM Presentation Manager look and feel, following the OSF's lead.
OS/2 will not be offered right away with the PC products. That would be contingent on bringing Decwindows to OS/2. MS-DOS 3.3, rather than OS/2, will be sold with the DEC PCs.
The unsophisticated user will be able to bring up VMS on a workstation and manage it using icons. 
A version of VMS with Posix compliance in kernel mode is reportedly coming, although no more specifics were offered. STANLEY GIBSON <<<>>> Title : Decwindows to cement inte Author : CW Staff Source : CW Comm FileName: decsoft Date : Jan 16, 1989 Text: Digital Equipment Corp.'s massive desktop announcement tomorrow will be the coming-out party for Decwindows, the cornerstone of an integration strategy that DEC hopes will carry it to the No. 1 position in the workstation market. Decwindows is part of a larger connectivity concept under which DEC hopes to glue together disparate operating systems in a seamless environment. The concept is analogous to IBM's Systems Application Architecture (SAA). It was unclear last week when DEC plans to unveil the full connectivity scheme, known as Applications Integration Architecture (AIA). ``AIA is an antidote to SAA. Its advantage is that it includes Unix,'' said Terry Shannon, director of the DEC Advisory Service at International Data Corp., a market research firm in Framingham, Mass. IBM has made it clear that its SAA and Unix offerings will remain separate. ``The Decwindows-AIA strategy will enable DEC to sell into new accounts by integrating non-DEC personal computers and workstations. In such accounts, DEC will own the network, and the vendor who owns the network controls the account,'' Shannon said, summarizing DEC's strategy. Even though Decwindows quietly began shipping in December with DEC Ultrix 32 Version 3.0, DEC will make its formal announcement of the offering tomorrow, complete with pricing and availability. Triple whammy DEC will add Decwindows to VMS Version 5 in a new release and will announce Decwindows for Microsoft Corp. MS-DOS, which will be offered with DEC's Tandy Corp.-made PCs. Thus, Decwindows will run on all three operating systems for which DEC is announcing hardware: MS-DOS, Ultrix and VMS. 
But while DEC is embracing MS-DOS and Unix on the desk top, Apple Computer, Inc.'s Macintosh, which apparently had a key role in DEC's desktop strategy last year, is going to be absent from tomorrow's festivities, DEC officials confirmed. One reason for its absence could be that Macintosh windowing is different from Decwindows, although Shannon claimed that DEC engineers have had Decwindows running on a Macintosh for six months. At the announcement, DEC will showcase some 80 developers who have been working on applications that use Decwindows. About 20 applications will be ready to ship immediately, DEC Vice-President Peter Smith said. Decwindows will also reflect the glory of the Open Software Foundation's recent selection of the Decwindows tool kit as the basis for the graphical user interface of its OSF/1. By Stanley Gibson, CW staff <<<>>> Title : News shorts Author : CW Staff Source : CW Comm FileName: short19 Date : Jan 16, 1989 Text: Arthur Andersen restructures Arthur Andersen & Co.'s 2,200 partners approved a restructuring plan last week that places the firm's consulting practice on an equal footing with its traditional tax and audit practices. Starting Sept. 1, Andersen will have two reporting structures to the partnership's chief executive officer _ one for consulting and another for the tax and auditing groups. A new chief executive officer will also be elected, replacing current CEO Duane Kullberg, who has held that post since 1980. The moves came in response to pressure from the consulting partners, seven of whom left late last year to start their own practice. DEC signs with Apollo Following in the footsteps of IBM and Hewlett-Packard Co., Digital Equipment Corp. has announced plans to license Apollo Computer Inc.'s Network Computing System (NCS) for possible use in future products. 
NCS, which Apollo has proposed as part of the Open Software Foundation's software environment, is said to provide tools for distributing applications across a multivendor computing network. Apollo, meanwhile, has announced the immediate availability of NCS versions for DEC's VMS and Ultrix and for Sun Microsystems Inc.'s SunOS operating systems. Comten boosts processor NCR Comten, Inc. is expected this week to announce an enhancement that reportedly boosts the Comten 5660 communications processor's total throughput to 80% greater than that of IBM's 3745. However, the firm said that the comparison only refers to IBM's single-processor 3745 Model 210 and does not refer to the dual-processor version. The High Performance Feature board is available now and is priced at $60,000. NCR Comten said it will also reduce the 5660's base price by $125,000 to $175,000. Dataproducts not looking to sell Printer products supplier Dataproducts Corp. last week issued a statement indicating that it has responded to a takeover inquiry with a letter indicating that the board of directors ``is not seeking to sell the company at this time.'' The company received a letter Dec. 12 seeking a meeting for an acquisition proposal from a representative of a New York investment group holding 5.2% of the company's stock. Dataproducts said last week any acquisition proposal should be submitted to the board in writing. PC vendor woes Part I Personal computer price-cutting and competition took a heavy toll on vendors last week. AST Research, Inc. announced it will report a loss for the second quarter ended Dec. 31 because of ``the action of competitors who have announced price reductions, particularly on maturing systems.'' But about $6 million of the $7 million to $9 million loss could result from a charge for the potential sale of AST's Camintonn division, which the company acquired in 1986 for its DEC add-on products. 
PC vendor woes Part II Also troubled in the PC arena is Wyse Technology, which last week announced an impending quarterly loss, the departure of its president, a 15% work-force reduction and a revenue plunge of 50% from year-earlier levels. President and Chief Operating Officer Phillip E. White, who joined Wyse from IBM in 1986, left to become CEO of database software vendor Informix Corp. Wyse will reportedly cut 560 jobs; 400 of those cuts are expected to come in Far East manufacturing. In addition, the San Jose, Calif., personal computer and terminal maker said it sought and received waivers from certain financial obligations to its lenders. <<<>>> Title : Unix goes to entry level Author : CW Staff Source : CW Comm FileName: dgintro Date : Jan 16, 1989 Text: WESTBORO, Mass. _ Data General Corp. furthered its commitment to a Unix product line last week with the introduction of an entry-level, multiuser Unix system based on its Dasher/386 personal computer. The Dasher/386 Unix platform incorporates a 16-MHz Intel Corp. 80386 microprocessor, is compatible with the IBM Personal Computer AT and uses the 386/IX product set developed by Interactive Systems Corp. The 386/IX product set is based on the AT&T-certified Unix System V/386 Revision 3.0. The system features two of the University of California at Berkeley Unix 4.2 facilities: the C shell, which is a command-language interpreter; and Sendmail, a general-purpose, internetwork mail-routing facility. The Dasher/386 Unix system supports up to 16M bytes of system memory in 1M- or 4M-byte increments and offers up to 318M bytes of internal disk storage. Also provided will be intelligent eight-line serial controllers, which allow up to 25 asynchronous serial connections. The Dasher/386 Unix system with 40M bytes of disk storage, 4M bytes of memory and a 386/IX runtime operating license is priced at $7,320. The same system with 70M bytes of disk storage costs $8,020. 
The products reportedly will be available next month. In a related announcement, DG made public the signing of an agreement with Framingham, Mass.-based Language Processors, Inc. (LPI). Under the terms of the pact, DG will offer LPI's 32-bit compilers on the Dasher/386. By James Daly, CW staff <<<>>> Title : GM to lift fiber MAP ban Author : CW Staff Source : CW Comm FileName: fibergm Date : Jan 16, 1989 Text: DETROIT _ General Motors Corp. has finally announced plans to embrace fiber-optic cabling for the factory floor. But this about-face comes several months after the MAP 3.0 standard was frozen in place. As a result, users are likely to face a lengthy wait for fiber-based Manufacturing Automation Protocol networking products. ``We recognize that fiber will be part of the action,'' said Jack Eichler, GM's director of advanced manufacturing engineering. ``We will install it when it becomes part of the MAP standard.'' That may not be anytime soon because right now, Eichler said, ``the connection technology is different depending on which vendor you go to.'' But sources both inside and outside of GM complained that a solid, consistent fiber-optic MAP standard would be available now if the top people in GM's MAP effort had supported it. Under GM's leadership last summer, the MAP/TOP Steering Committee finalized MAP 3.0 for the next six years, with fiber-optic cable relegated to an equivocal position in the standard's appendix. MAP barriers Until such time as fiber becomes part of the main standard, GM has officially barred the medium from its factory floors, according to Ali Bahroloomi, chairman of GM's MAP Task Force and manager of communications networks at GM's Buick Oldsmobile Cadillac division. Internally at GM, broadband and carrierband are currently the media that should be implemented in manufacturing and industrial applications, Bahroloomi said. 
GM spokesmen admitted that the fiber-optic medium has advantages that are uniquely suited to factory networking applications. For example, it has far greater bandwidth capacity and greater resistance to electrical interference and other environmental factors than coaxial cable. But fiber has gotten no encouragement from GM's MAP leaders in the past, according to Orest Storoschuk, an engineer at GM's Cadillac Pontiac Canada division who has championed fiber-based networking at GM for the past decade. In particular, Michael Kaminski, former MAP project manager at GM, has been consistently pushing broadband cabling over fiber-optics, Storoschuk said. ``It's fair to say that the priority [of MAP leaders at GM] has never been to put fiber into MAP but to get the basic MAP specification in place,'' said an industry consultant who requested anonymity. ``I think the fiber standards effort could have gone faster without compromising this goal.'' Group approval A fiber-based version of the MAP 802.4 Token-Bus protocol gained approval from the Institute of Electrical and Electronics Engineers, Inc. (IEEE) this fall, which ``in LANs, is normally tantamount to [becoming] a standard,'' said Robert Crowder, president of Newark, Del.-based manufacturing consulting company Shipstar Associates, Inc. Final approval from the International Standards Organization is expected in May, Crowder added. However, ``IEEE approval does not mean it has to be part of the MAP specification; they recommend a lot of things,'' Bahroloomi said. The 802.4H specification is now part of the MAP 3.0 appendix, but vendors are not required to include appendix items in their products in order to be MAP-compliant, Crowder said. ``Shipstar can already build a fiber-based MAP network now'' based on available vendor products, but an appendix ``does not carry the same level of endorsement of fiber-optic technology'' that would encourage other vendors to enter the market, he added. 
While MAP leaders at GM were officially banning fiber from the company's factory floors, the Chevrolet Pontiac division was quietly installing the forbidden medium on its local-area networks _ including three fiber-based MAP networks that are currently in the works. The networks employ MAP controller cards from Ungermann-Bass, Inc. and fiber-optic modems from Thomas & Betts Corp. in Bridgewater, N.J. But Chevrolet Pontiac's fiber-based MAP projects are definitely the exception that proves GM's no-fiber rule, according to Storoschuk. The division started using fiber before MAP was defined, in 1979. Then when the standard came along, GM Canada received permission to implement it on top of its existing cable. ``If Oshawa asked to implement fiber now, no one would let them do that,'' Bahroloomi said. Right now, GM is working with vendors to ensure interoperability of fiber-based MAP products. It is also ``in the process of testing [fiber-optic cabling technology's] pros, cons, strengths and weaknesses,'' Eichler said. By Elisabeth Horwitt, CW staff <<<>>> Title : Lawyers fret over risks o Author : CW Staff Source : CW Comm FileName: legal Date : Jan 23, 1989 Text: WASHINGTON, D.C. _ Someday, somewhere, an electronic data interchange (EDI) transaction is going to go awry, and the hostile finger-pointing is going to land the parties in court. The risks in EDI transactions include transmission errors, faulty data, failed communication, unauthorized disclosure to third parties, interception during transmission, late or delayed transmission and transmission to the wrong parties, according to a report by an American Bar Association's (ABA) task force. That is why several industry groups are urging EDI business partners and vendors to consider the legal implications of EDI transactions and prepare for that inevitable day in court. 
``EDI replaces paper documents _ a medium that enjoys a long history of support under the law as a carrier of legal information _ with a new, electronic medium, the status of which is not well defined in the law,'' according to a monograph written by Dallas attorney Benjamin Wright and published by TDCC: The Electronic Data Interchange Association (TDCC/EDIA) in Arlington, Va. For example, EDI transactions may not be enforceable as contracts under the U.S. statute of frauds because they are not ``written'' and ``signed,'' legal experts said. In addition, there are questions about who is liable for transmission errors. The topic is getting increased attention from associations, standards committees and vendors because the legal uncertainties could stifle the fast-growing EDI market. At the TDCC/EDIA's December conference here, a seminar on legal issues was so crowded it was standing room only. ``We're not trying to scare anybody,'' said attorney and consultant Michael S. Baum, ``but EDI uses technology to form a business contract. That has a high level of legal content, so it deserves commensurate attention to the legal implications.'' Baum is president of Independent Monitoring, a consulting firm in Cambridge, Mass., as well as chairman of the ABA's Electronic Messaging Services Task Force. He also serves as chairman of the ANSI X.12 Committee's newly formed Legal Issues Task Group. Among the participants in the X.12 task group are Bank of America National Trust & Savings Association, LTV Steel Co., Mobil Oil Corp., Shell Oil Co., DuPont Co. and Electronic Data Systems Corp. Baum suggested that MIS managers educate corporate attorneys about EDI technology, monitor legal developments and review internal procedures and trading agreements with an interdisciplinary team of technical, managerial and legal experts. 
The prospect of EDI-related lawsuits will increase as the EDI trading universe grows to include less-sophisticated users and companies that may have financial problems, Baum said at the TDCC/EDIA conference. A classic case J. T. Westermeier, a partner at the Washington, D.C., law firm Abrams, Westermeier & Goldberg, said EDI is a classic case of technology advancing faster than the law or courts can keep up. So far, there is no case law providing guidelines for how to structure legally enforceable EDI transactions. To compensate for the fact that EDI transactions are paperless, some EDI users have negotiated written ``trading-partner agreements'' to preauthorize the EDI transaction and set terms and conditions. The ABA task force is starting the difficult task of developing a model trading-partner agreement that provides a minimum level of protection and fairness, Baum said, adding that it could be tailored to meet the needs of specific industries. An important part of the trading-partner agreement is to spell out who is liable when something goes wrong. But apportioning liability _ among the trading partners, value-added networks and software vendors _ gets more difficult as the transactions get more complex and the number of third parties increases, the ABA's report said. Under the provisions of negligence law, errors in business transactions must be fixed quickly after they are discovered or the liability goes up dramatically, Westermeier pointed out. Consequently, it may not be wise to leave EDI systems running unattended, he said. Wright, Baum and Westermeier cited other legal issues that create uncertainty in the EDI marketplace. One is that EDI transactions may not be compatible with certain government regulations such as those that require paper forms and written signatures. The ANSI X.12 Government Project Team is working on this problem. 
Another is that EDI could raise antitrust concerns if an industry's EDI standards or the cost of EDI systems creates an unreasonable barrier for small companies trying to break into an industry. By Mitch Betts, CW staff <<<>>> Title : Taking the hard road to t Author : George Schussel Source : CW Comm FileName: schussel Date : Jan 23, 1989 Text: Despite IBM's prodigious resources, it will be harder for the company to provide distributed database capabilities across its systems than for any other vendor. IBM is committed to a distributed database development program that conforms to Systems Application Architecture and SQL and involves four different products. Almost all of IBM's competitors will be taking a single database management system product and distributing it over diverse operating systems _ IBM's and others such as Unix and Digital Equipment Corp.'s VMS. IBM understands well that in the 1990s, mainframes will become repositories and network servers for a large variety of midsize and small machines, at which most of the processing will occur. Cooperative processing and distributed databases will be the technologies that allow this new scheme to succeed. IBM's plan for providing distributed database capability is based on a multiphased approach. There are three steps yet to be accomplished. They involve a remote unit of work, a distributed unit of work and a distributed request. Stepping-stones In step one, an application may send discrete units of work to different remote databases. However, each unit must go to only one physically remote database. This requirement is loosened somewhat in step two, at which each committable unit of work may consist of a number of discrete SQL statements, and each one of those SQL statements is then required to go to a single physical site. In the third step, the restraints of physical location are removed and individual SQL statements may support physical execution on data that is located at diverse sites. 
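IBM's three steps can be read as progressively looser rules about which physical sites a committable unit of work may touch. A minimal sketch of those rules in Python (the function and site names are hypothetical illustrations, not IBM's actual interface):

```python
def phase_allowed(phase, unit_of_work):
    """unit_of_work is a list of SQL statements; each statement is
    modeled as the set of physical sites whose data it touches."""
    if phase == "remote_unit_of_work":
        # Step one: the whole unit must go to a single remote database.
        sites = set().union(*unit_of_work)
        return len(sites) == 1
    if phase == "distributed_unit_of_work":
        # Step two: each statement may pick its own site, but only one.
        return all(len(stmt) == 1 for stmt in unit_of_work)
    if phase == "distributed_request":
        # Step three: one statement may execute against data at many sites.
        return True
    raise ValueError(phase)

# One SELECT joining data held at two different sites:
join_two_sites = [{"Toronto", "SantaTeresa"}]
print(phase_allowed("remote_unit_of_work", join_two_sites))       # False
print(phase_allowed("distributed_unit_of_work", join_two_sites))  # False
print(phase_allowed("distributed_request", join_two_sites))       # True
```

Under this reading, only the third step lets a single SQL statement span data at diverse sites, which is why it alone qualifies as a true distributed database environment.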
In terms of currently accepted definitions in the industry, it is only this final phase that can be considered a true distributed database environment. IBM's plans call for delivery of phase three capabilities in 1992 or 1993. An essential ingredient in this plan is determining which IBM database systems will play. Four principal development laboratories are currently participating, including Toronto (SQL/DS), Santa Teresa (DB2), Rochester (Application System/400 SQL) and Austin (OS/2 Extended Edition). Currently announced future product plans from the IBM laboratories only support ``like-like'' environments. What this distinction means, for example, is that the DB2 remote unit of work (step one) capability that is to be delivered in late 1989 will only work with diverse DB2 partners. SQL/DS and OS/2 Extended cannot play _ yet. At some point, distributed capabilities will be supported over unlike partners, but this feature adds a significant level of complexity, especially because of IBM's strategy designating different relational database engines. Although IBM developers share research and product development plans among different groups, they do not share source code for the DBMS engines. The result of this policy is that even for the two different SQL DBMS engines that run on IBM 370 systems (SQL/DS and DB2), there are significant differences _ different return error codes and different handling of nulls. Santa Teresa staffers argue that differences among the SAA operating systems mean each must have its own physical implementation of SQL to operate efficiently. This is true for implementation-specific functions, such as cross-memory services and memory management. However, since the essence of distributed database services is distributed query management, IBM's task would appear to be much simpler if it could interface the same DBMS on different operating systems. 
Another reason for the separate product/separate operating system policy lies in IBM's management and accounting policies. IBM products have to stand on their own for profitability analyses. If two groups are building DBMS for two different environments, and one sends its query management source code to the other, then there must be a cross-subsidy agreement between the two. IBM doesn't seem to want to tackle this problem. The ultimate success of IBM's distributed database strategy is tough to forecast. A number of mainframe competitors, such as Computer Associates, Oracle and Relational Technology, appear to be ready to offer distributed database capabilities to their customers years before IBM's products are delivered. The real issue here is going to be how technically successful these distributed DBMS products will be when they live in and support the MVS environment. By George Schussel; Schussel, president of Digital Consulting, Inc. in Andover, Mass., is a lecturer and futurist. He chairs The Database Cooperative Processing Symposium, Software Futures, 4th and 5th Generation Data Management Software and Unix Futures conferences. <<<>>> Title : The new limit Author : CW Staff Source : CW Comm FileName: edit116 Date : Jan 23, 1989 Text: TWO YEARS AGO, a leading market research company predicted that desktop computers would operate at $1,000 per 1 MIPS by 1991. Now, we all remember other predictions for the videotex market, the home computer market, etc. And often it seems to be the rule that long-range predictions are grossly optimistic. But last week DEC apparently broke that performance/price barrier with the introduction of its reduced instruction set computing-based Decstation 3100, which headlined what the company termed its most significant set of product announcements ever. DEC's aggressive workstation thrust will touch off the most ardent competition within the computer industry since the PC clone makers began battling a few years ago. 
This seems to be one struggle in which the customer base will almost assuredly emerge a winner. For one thing, prices are going to fall _ perhaps hard _ across the workstation spectrum. As margins tumble, vendors tend to become almost solicitous toward their customers, like car dealers during the turn of the model year. Further, increased demand stimulated by cheaper hardware invariably coaxes that much more software development from the third-party vendors _ in this case, the Unix developers. No wonder the research companies are calling for 70%-plus growth in workstation sales this year. That's one prediction users can bank on. <<<>>> Title : Name-dropping Author : CW Staff Source : CW Comm FileName: edit116a Date : Jan 23, 1989 Text: What do the following people have in common: John Akers, Les Alberthal, Robert Allen, Jack Berdy, W. Michael Blumenthal, Gordon Campbell, John Cullinane, Jack Davis, Charles Exley, William Foster, John Frank, Bill Gates, James Goodnight, John Imlay, Jerry Junkins, R. James Macaleer, David Martin, Jim Manzi, Scott McNealy, Gary Morgenthaler, Tom Nies, Ken Olsen, William Patton, Ken Pontikes, Ryal Poppa, Michael Potter, Robert Potter, Robert Price, John Roach, Ben Rosen, Lawrence Schoenberg, George Shaheen, Roger Sippl, Jim Treybig, Thomas Vanderslice, An Wang, Charles Wang, Phillip White and John Young? Besides being a Who's Who of the computer industry, these corporate leaders make up the Chairman's Committee, which will submit nominations for Computerworld's Awards for Innovative Application of Information Technology. Hatched in association with the Smithsonian Institution, Computerworld's awards will be presented in June to members of the information systems community who have achieved outstanding progress for society through visionary applications of information technology. We wish to express gratitude to these individuals, who give their time and energy to seeking out the many quiet heroes of the IS world. 
<<<>>> Title : DB2 battle is over Author : Fabian Pascal Source : CW Comm FileName: pasclet Date : Jan 23, 1989 Text: Frank Sweet's article on the future of databases [CW, Nov. 28] is mostly superficial or simply wrong. He sees a battle still raging between DB2 and older DBMSs such as IDMS and Adabas _ a battle that has long been over. He calls Oracle and Ingres ``small newcomers.'' He criticizes the SQL database language for not being a development language, but that, for various reasons, was the precise intention. He claims that SQL has ``so many different ways of accomplishing the same thing that it is hard for one person to understand another's program.'' Really? SQL statements per se are not programs _ that's the whole point. He may be right that SQL ``is unlikely to be popular among programmers,'' but only if they continue to ignore the relational approach behind SQL, seeing the language just as a portability and connectivity standard for traditional applications. Indiscriminate application of procedural thinking and unwillingness to distinguish between the physical and logical levels do indeed defeat major practical objectives in SQL, any hope for fixing its deficiencies and, thus, acceptance. It is terribly misleading to claim that ``relational once meant lacking explicit interrecord relationships'' and it is absolutely untrue that ``today it means enabling them to exist.'' Fabian Pascal Washington, D.C. <<<>>> Title : Kissing IS goodbye Author : Gus A. Galatiano Source : CW Comm FileName: gallet Date : Jan 23, 1989 Text: I read with interest Michael Alexander's article entitled ``Don't kiss IS guy goodbye'' [CW, Nov. 28]. I feel, however, that it would have been much more realistic if the title had been ``Don't kiss IS guy goodbye yet.'' No one can dispute the fact that for the last 10 years or so a rather significant force has transformed traditional centralized computing into distributed computing. 
This trend, which led to the advent of personal computers, workstations and LANs, was really the result of general dissatisfaction among users with the services they were getting over the years from the ``high priests'' of MIS and the data center. I agree that there will be a need to protect these enthusiastic users from hurting themselves; a sort of internal technology advisory group may be necessary to coach them. But that's all that will eventually remain from the bloated hordes of MIS and data center operations staff of the past. Centralized computing in the form of powerful supercomputers will soon have a place only in certain CPU-intensive environments like R&D. Gus A. Galatianos President Advanced Computer Consulting International Whitestone, N.Y. <<<>>> Title : Ashton-Tate loyalty Author : Bill Komanetsky Source : CW Comm FileName: komanlet Date : Jan 23, 1989 Text: In your article ``Ashton-Tate sues Fox over copyright'' [CW, Nov. 28], you stated, ``Equally ironic is Ashton-Tate's own product strategy, which has borrowed liberally from the work of others. Multimate, the word processing program Ashton-Tate acquired in 1986, was clearly patterned after a Wang Laboratories, Inc. word processor.'' Ashton-Tate did not borrow anything from the Wang word processor. The Multimate product does look as if it were built based on Wang's powerful word processor, but Ashton-Tate didn't do the construction. A company aptly named Multimate International designed the program. Ashton-Tate bought the Multimate International company in 1986, long after the product was in the public's hands. I am not a computer user who gives my sole loyalty to any piece of software easily. However, Ashton-Tate has become a very good supplier of PC software, and I think it deserves the benefit of the doubt in this case. Bill Komanetsky IBM Tampa, Fla. <<<>>> Title : No VSE aftertaste Author : Jeffrey E. 
Smith Source : CW Comm FileName: smithlet Date : Jan 23, 1989 Text: Regarding the article ``VSE users upbeat'' [CW, Dec. 5] _ what bitter aftertaste? The only aftertaste that might be bitter is from the lack of press coverage on VSE and related products. As IBM's most widely used operating system, VSE deserves a lot more coverage. The features that you claim users have ``long coveted'' have been coming forth in a steady stream for the past four years. There are still many more features that VSE users want, like additional address spaces, ACF/VTAM in a private address space and support for more than 16M bytes of real storage. Given the resources and priorities that all DP shops have to juggle, you can probably rest assured that these items will come to fruition. Perhaps when trade journals stop viewing VSE as ``ambiguous,'' they will finally realize that IBM's development efforts toward VSE are not being rekindled; they've been blazing all along. Jeffrey E. Smith Technical Support Manager Saint Luke's Hospital New Bedford, Mass. <<<>>> Title : Mediocrity, my foot Author : Douglas E. Perso Source : CW Comm FileName: personle Date : Jan 23, 1989 Text: I take strong exception to Frank Sweet's statements in ``Database directions'' [CW, Nov. 28] that ``SQL's syntax is mediocre at best'' and to his suggestion that ``it has so many different ways of accomplishing the same thing . . .'' is a negative attribute. I have found SQL syntax exceptionally easy to learn and teach. SQL is a clear departure from anything capable of being labeled ``traditional.'' Rather than being a programming language, SQL extends the capabilities of other languages by providing a set of database access statements that blend into most languages smoothly. It is easy to describe SQL as ``fat and complex'' without genuine comparisons with other approaches. 
Having had several opportunities to solve test problems with both SQL and several of the so-called fourth-generation languages, I have found that SQL and a good structured language such as PL/I or IBM's Rexx ran rings around the slow and inflexible fourth-generation languages. Douglas E. Person Senior Technical Specialist Broadway & Seymour Charlotte, N.C. <<<>>> Title : A (data) space odyssey Author : Charles P. Lecht Source : CW Comm FileName: lecht109 Date : Jan 23, 1989 Text: At a time when microcosmic space consumes the attention of most computer scientists, it is interesting to imagine the macrocosmic space above our Earth as a computer data storage medium. Suddenly, we are relieved of dealing with the mysterious unseeable inner space of a centimeter-square sliver of silicon on which a hundred miles or more of microthin circuitry has been inscribed. Instead, we soar into outer space to consider a world we can see, one we've been watching since time immemorial. In the silicon world this year, 1M-bit chips have been joined by 4M-bit chips. 16M-bit chips are no longer uncommon, and 64M-bit chips have made their appearance. A 64M-bit chip can contain more than 3,000 pages of text at 300 words/page and 8 char./word. Scientists are digging faster and deeper into their microcosmic world to create chips of ever larger capacity and ever smaller size. But this world is not endlessly small. Not too far away lies the final realm they can ever hope to reach, where distances are measured in fermis _ one quadrillionth of a meter. At this size, the instruments they must use, like the physics they must employ, are too crude to handle anything smaller. Scientists will have reached the practical limits of their microcosmic universe by the end of this century unless an unforeseen opening to another universe presents itself along the way. However, I wouldn't count on this. 
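The page count for a 64M-bit chip checks out with quick arithmetic; a short Python sketch, assuming 64M bits means 64 x 2^20 bits and one byte per character:

```python
bits = 64 * 2**20           # capacity of a 64M-bit chip
chars = bits // 8           # one byte (8 bits) per character
chars_per_page = 300 * 8    # 300 words/page, 8 characters/word
pages = chars / chars_per_page
print(round(pages))         # prints 3495 _ comfortably "more than 3,000 pages"
```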
Scientists who are exploring the realm of disk-based storage media are not only bound by the same problems facing the chip folks, but they must deal with electromechanical contraptions in the devices that present much fiercer limitations. Spinning wheel That we must use spinning disks _ forget about tapes _ to capture and retrieve data reveals just how archaic today's computer systems are. Until relieved of their moving parts, computers will always fail for reasons far more basic than advanced physical chemistry suggests. Even the availability of multigigabyte disks or the possibility of a terabyte of information on a disk won't change this situation. The need to create, operate and maintain these disks on a whirring platform is severely limiting to their usefulness and reliability. However much we expect to store on any single Earth-bound chip, it is clear to me that this capacity isn't enough, and we would do well to look elsewhere for future computer memory technologies. Where? The place I have in mind employs a telescope instead of a microscope to unravel its mysteries. It is the space around this globe. As I see it, what might practically be contained on a chip today or in the future is but a mere fraction of what might one day be contained on a single channel in wideband radio and/or in the light-wave frequencies we send into space. The data storage possibilities available in the medium we usually refer to simply as ``space'' boggle the mind. For example, between a very high satellite and Earth, we could place just about all the data we have ever created or will create in the future. Broadcast as radio waves or laser beamed as light waves, data could be placed in continuous loops that bound and rebound between reflective facilities on both the Earth and the satellite, to be cycled forever _ however long that is _ and to be drawn upon when needed. 
Since the number of such loops could be virtually unlimited, so could the amount of data these could contain. And if the satellite were also capable of storing data, it could hold some in a supercooled memory bank employing scores of chips at virtually no refrigeration (space is cold) or electrical (because of solar power) cost. All the data we want or need could reside in an electronic veil of continually moving radio waves and photons fed from databases on the Earth and on the satellite. Our capture and use of such data would involve no more time than we would experience here on Earth in dealing with a very large hierarchically organized database: in fact, probably far less. Let's face it: The thousands of disk data files being used by organizations such as governmental census bureaus in heavily populated countries are inaccessible, or access time can run into days, weeks or even months. But we are not limited to people-made satellites for reflective media. There are the natural heavenly bodies, too. Of course, depending on how far away the reflecting body is in space, retrieval may not be as fast as we'd like. A trip by data to our planetary bodies or even farther before its return to Earth could be measured in minutes, hours, days and years. But the volumes that could be stored grow in proportion to the distance to their target reflector, so there is a payoff in using these, too. Talk about archival storage. Spacing it out Of course, the implementation of space data loops in today's technological environment is not possible. The means to broadcast, maintain and use the databases these might contain are unavailable to us. And broad bandwidth communications facilities with proper signal-to-noise ratios that are unaffected by phenomena such as a raging storm on the sun are unavailable, too. Laser technology, which is most promising in its capability to be unaffected by natural phenomena, is in its infancy. 
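How much data one bounding loop actually holds is just the delay-bandwidth product: bits in flight equal the link's data rate times the round-trip time. A rough Python illustration, assuming a reflector at geostationary altitude and a hypothetical 1G bit/sec. link (both figures are illustrative, not from the column):

```python
SPEED_OF_LIGHT = 3.0e8                        # meters/sec.
altitude = 3.6e7                              # assumed geostationary orbit, about 36,000 km
round_trip = 2 * altitude / SPEED_OF_LIGHT    # about 0.24 sec. up and back

rate = 1e9                                    # hypothetical 1G bit/sec. loop
bits_in_flight = rate * round_trip            # data circulating in the loop at any instant
print(bits_in_flight / 8 / 2**20)             # roughly 28.6M bytes per loop
```

A single such loop stores only tens of megabytes, which is why the scheme leans on a virtually unlimited number of loops _ and on more distant reflectors, whose longer round trips hold proportionately more data.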
But we can dream of the day that the necessary technologies exist and speculate on what effect they may have on our world of computer science _ if not just our world. By Charles P. Lecht; Lecht is an IDG News Service correspondent based in Tokyo. <<<>>> Title : Standards still missing i Author : Amy D. Wohl Source : CW Comm FileName: wohla Date : Jan 23, 1989 Text: As we begin this year, PC power is an ordinary fact of business life. The question isn't whether the average worker will get one, but when and which one. Yet the real value of a PC probably isn't the information that the worker can personally generate, but rather its ability to connect him to a world filled with information other people create. We all want and need to be a part of a seamless environment _ a place where all the information we could possibly want is somehow effortlessly available. Is this desire a fantasy? Perhaps for now it is. Today, there is plenty of information available. In fact, there sometimes seems to be far too much. Two days away from the office and I return to three inches of paper mail, piles of telephone messages and screens full of electronic mail. I might argue that what I need is less information _ not more. Our problem is that we want both: We want all of the information to be available, but we want it to be available in a form that permits it to be filtered, analyzed and manipulated without our needing to be intimately and personally involved in each tedious step in the process. We can do some of that right now, but the many forms and formats the information can come in make the amalgamation process not only tedious but potentially very difficult. The underlying information for a single business document could come from a dozen different sources, each stored on a different machine, each with its own operating system, applications programs and so on. 
The missing link We are ready to use information from anywhere in our information universe, but the information universe isn't quite ready for us. What is missing? Standards. We need to be certain that everything we have ever created on a computer, and anything we can force-fit into it through such miracles as image scanning, optical character recognition and voice recognition, can be readily understood for later retrieval and rearrangement. We will never all use exactly the same machine running exactly the same software _ if history is a good predictor _ but we can demand that our computer systems behave as if this were the fact. Standards _ rules that permit information to be readily understood across multiple hardware platforms, operating systems and application environments _ could accomplish this desirable goal. Several already We already have any number of de facto standards. Most of them are built around the widespread acceptance of a dominant software product, such as Lotus' 1-2-3 or Ashton-Tate's Dbase. IBM's Document Content Architecture (DCA) word processing format is an important de facto document interchange standard. Nearly every word processing and word processing-related product _ for example, desktop publishing _ knows how to export DCA-formatted documents and how to accept them as imports. In fact, minor word processing software often performs the interchange function via its compatibility with DCA. But suddenly, DCA is no longer enough. Documents once restricted to characters and numbers, formatted in typing-like styles, have been superseded in many offices by compound documents. Such sophisticated ``New Age'' documents may include multiple fonts and type sizes, embedded graphics and formats, such as those found in newspapers, magazines or other printed material. They might also include excerpts from spreadsheets and databases, voice annotations or references to animated materials from a laser disk or compact disk with read-only memory. 
Digital Equipment Corp. is prepared to set the standard for this new kind of document and has recently published its own compound document architecture. So far, we haven't heard vendors jump onto DEC's standard bandwagon. But sometimes a de facto standard emerges slowly, taking months or years to gain acceptance, such as Ethernet. Note that this evolutionary creation of a standard stands in sharp contrast with some earlier attempts. When IBM announced Distributed Office Support System (Disoss) in the early 1980s, word processing vendors were so sure IBM could set a de facto standard that dozens immediately announced their compatibility intentions. Disoss set a de facto standard without ever garnering any appreciable market share. Perhaps this lesson _ which required vendors to deliver on an expensive commitment with little commercial value _ makes today's developers reluctant to identify potential de facto standards without waiting to see marketplace reaction develop. Even though word processing standards are now set in the PC software industry, IBM is still an important factor in this market. IBM's Displaywrite continues to enjoy substantial market share, and many expect IBM to have much more to say on the subject of compound document architectures (CDA). DEC has been an important player in the office automation market with All-In-1 but has never been a high-profile rulemaker in the word processing game. CDAs are, of course, about much more than processing words. They are really about using documents as containers and vehicles for holding and transporting information. Such documents will surely include words, but they will also include graphics, pointers to databases and other application programs, sophisticated formats and dozens of new information objects (voice? animation?) that are still immature and relatively undefined. 
Hindsight is 20/20 The vendor or group of vendors that sets the standard for compound document architectures for the next round will have important advantages: Buyers will perceive that vendor as the market leader or standards bearer. Its products will attract more third-party incremental software _ making it much more valuable _ than that of other, nonstandard vendors. It will be in a better position to manipulate the standard to best show off and support its own product offerings. That prize is worthy of much effort, and winning the compound document standards battle is likely to attract a number of worthy opponents. But if from the battle a standard emerges, the real winners are the users; their desktop PCs become infinitely more useful as more and more information can be easily, readily and transparently accessed and combined. By Amy D. Wohl; Wohl is president of Wohl Associates in Bala Cynwyd, Pa., and editor of ``The Wohl Report on End-User Computing'' newsletter. <<<>>> Title : Contel Business Systems, Author : CW Staff Source : CW Comm FileName: hwcontel Date : Jan 23, 1989 Text: Contel Business Systems, Inc. has unveiled a line of 32-bit computers with enhanced applications for the restaurant industry. Dubbed the Contel Solution/1 series, the Intel Corp. 80386-based system reportedly features multiuser capabilities and performs restaurant-to-restaurant networking functions. The main processor operates under the AT&T Unix System V Release 3 operating system, according to the vendor, and the coprocessors run within a proprietary operating environment. Contel Solution/1 computers range in price from $12,000 to more than $40,000, depending on system configuration and cabling requirements. Contel, 1 Dedham Place, Dedham, Mass. 02026. 616-461-6400. 
<<<>>> Title : Symbolics, Inc.'s Graphic Author : CW Staff Source : CW Comm FileName: hwsymbol Date : Jan 23, 1989 Text: Symbolics, Inc.'s Graphics Division has announced an add-on rendering system designed to allow present users to increase rendering power in a cost-effective way. The 3653 dual-purpose graphics system reportedly allows users to install up to three CPUs in the 3653 computer. The add-on CPUs can also serve as black-and-white workstations for modeling and animation design, according to the vendor. The 3653 includes one CPU with 4M words of memory and a 380M-byte enhanced small device interface disk. It is priced at $59,850 from now until March 31. Additional 3653 CPU upgrades are available for $35,950 each. Symbolics, 1401 Westwood Blvd., Los Angeles, Calif. 90024. 213-478-0681. <<<>>> Title : Real Time Enterprises, In Author : CW Staff Source : CW Comm FileName: hwrealti Date : Jan 23, 1989 Text: Real Time Enterprises, Inc. has expanded its RTE Optical Disk File Manager (ODFM) product family. The ODFM line was designed to provide on-line access to the Optimem non-erasable optical disk drive for Apollo Computer, Inc. computers. The latest addition, the ODFM-300, reportedly offers support for the new Optimem drive, the Model 2400. The double-sided ODFM-300 has 1.2G bytes/side, the vendor said, and is suitable for all applications that require random access to large databases of permanent or historical data. The ODFM-300 costs $20,060. Real Time Enterprises, 3000 Winton Road S., Rochester, N.Y. 14623. 716-427-8090. <<<>>> Title : Spectrum Information Syst Author : CW Staff Source : CW Comm FileName: swspectr Date : Jan 23, 1989 Text: Spectrum Information Systems Programming, Inc. has announced that its System/36 Distribution and Light Manufacturers application software will be available in native mode for the IBM Application System/400 mid-range computer. 
Scheduled for delivery in April, the base package with source code will be priced from $32,000 to $56,000, depending on processor model. Spectrum, 1990 S. Santa Cruz, Anaheim, Calif. 92805. 714-937-1311. <<<>>> Title : Wang 800 line rings few b Author : CW Staff Source : CW Comm FileName: dwang Date : Jan 23, 1989 Text: With its new mail-order program, Wang Microsystems aims to make ordering equipment as easy as making a toll-free telephone call, but MIS managers say they are not ready to let their fingers do the walking. For many information systems purchasers, the direct-dial ordering concept triggers mail-order flashbacks and nightmares: late deliveries, no service or support and nasty surprises when they open the shipping boxes. In addition to mail order's tainted image in the minds of many MIS managers, the Wang Laboratories, Inc. division has its own image problems to overcome if it is to call MIS managers to the phone. Wang user Joseph Ianello, corporate MIS manager at Amerada Hess Corp. in Woodbridge, N.J., said, ``Any 800 number in and of itself gets no reaction from us.'' ``Maybe in a different situation, it would interest me,'' said Jean Gibour, MIS manager at EIS Brake Parts in Berlin, Conn. ``But we deal with the local distributor, who supports their products and is nearby. I've gotten a little spoiled by that.'' However, Wang's latest marketing approach may be its best chance of broadening its presence in the personal computer market. Analysts have noted that Wang has been locked out of many PC distribution channels, which has hampered the company's efforts to break into the corporate environment in many instances. Bruce Stephen, PC analyst at International Data Corp., a Framingham, Mass.-based market research firm, said that considering Wang's obstacles in getting its PCs to market, mail order may turn out to be a viable alternative channel for the company. 
Stephen noted that the mail-order approach is probably better suited for buying software than hardware because of the rigors involved in shipping equipment by mail. Despite the risks, companies such as Dell Computer Corp. in Austin, Texas, and many smaller PC vendors have managed to carve out a viable stake in the PC market using the 800-number technique, Stephen said. IDC statistics indicate that Dell sold 1.2% of the PCs shipped in the U.S. last year. But while the PC market is growing, Wang's market share shriveled to 0.7% last year. That decline makes it that much more important for Wang to succeed with a mail-order strategy, Stephen said. Wang officials said that the effort will soon be complemented by large-scale advertising and direct mail campaigns. They insisted that because Wang stands out as the only major computer vendor in a world of mostly no-name and supportless mail-order houses, customers will be ``compelled'' to order Wang equipment. But MIS managers warn that Wang will not succeed in a mail-order world by name recognition alone. Brian Ellis, MIS director at bowling and billiards equipment maker Brunswick Corp. in Muskegon, Mich., said that he has mixed feelings about mail-ordering PC equipment. ``You get decent discounts but also a lot of broken promises,'' he said. Ellis said that, in his experiences with mail orders, lack of inventory and late deliveries have discouraged him from ordering by mail again. However, he said the merchandise usually arrives in good shape. But while MIS managers may not feel altogether secure in ordering computer equipment by mail, IDC's numbers indicate that managers continue to use mail ordering as an alternative to traditional distribution channels. According to IDC, PC mail orders now represent 5% of sales and are expected to grow. By William Brandel, CW staff <<<>>> Title : Frank bucks the big guns Author : Douglas Barney Source : CW Comm FileName: home Date : Jan 23, 1989 Text: Cut Frank a break. 
Frank Rogers thought he would do something good for personal computer software customers. So after hours he wrote a $69 Ashton-Tate Dbase-compatible program, quit his job as an MIS director and formed 1 on 1 Computer Solutions, Inc. in Trumbull, Conn., to market the thing. Based on the letters Frank received from customers (and forwarded to us), 1 Plus 1 = 3 is quite a value. But Rogers may well wonder whether it was wise to resign from a comfortable position as an MIS manager to enter the cutthroat world of micro software. First Frank got into a beef with Fox Software. He bought the Foxbase Dbase-compatible compiler to develop applications and called the company to ask if there was a limit to what he could write. The answer was no. So Frank used Foxbase to write his Dbase compatible. Fox didn't like it, but it had to live with it. Now Frank is in danger of offending Ashton-Tate. In a magnificent display of saber-rattling, company Chairman Ed Esber has cast a dark cloud over Frank's decision to take on the database management system giant. Not only did Esber sue Fox Software _ a company he almost bought _ for copyright infringement, but he has made threats against other Dbase-like products, including, by definition, little ol' 1 Plus. We asked Esber if there was a mechanism by which a little guy like Rogers could submit a product to Ashton-Tate to find out if it infringes on Tate's still untested copyrights. Sorry, Esber said. He's on his own. Rogers is undeterred. He is working on a new version with a hot new menu system that is quite different from Dbase. But he now faces a new dilemma. If Rogers fades away, Ashton-Tate can prove no damages and likely won't sue. If he becomes successful, then Ashton-Tate may bring in the lawyers. Of course, healthy sales would constitute a healthy defense fund. A PC is just a phone call away. There is plenty of zirconia and polyester being hawked on the Home Shopping Network cable television show. 
But there is also an odd array of obsolete and liquidated PCs up for grabs. The Wang Office Assistant _ a machine that offered everything a PC should offer except compatibility _ has shown up, along with old Xerox PCs and an assortment of NEC and Epson laptops with tiny LCD displays designed to ruin the best of eyes. In short, you know the machine is a dud when you can buy it on TV. Therefore, it should come as no surprise to see the IBM PC Convertible laptop on sale. At first blush, though, the price was a shocker. A tag of $998.77 with a printer, battery pack and 512K bytes of random-access memory was pretty tempting for this dual-floppy dog that the show said lists for $2,129. In fact, it was so tempting that 660 were sold in about 15 minutes. With the help of that pretty Home Shopping Network lass fingering the memory card, IBM probably doubled its entire Convertible sales in those 15 minutes. But on closer inspection, this price really isn't so hot. After all, the Convertible has more than a few canine qualities, and it is clear that IBM is fire-saling these babies. In fact, the original machine with 256K bytes now lists for a paltry $500, and that printer goes for a mere $142. So $1,000 is more than a bit steep. If the Home Shopping Network would only offer real bargains and start selling ATs! Sorry, Mort _ you're wrong. Corporate Software founder Mort Rosenthal makes a nice living selling software and support to Fortune 1,000 firms and seems to understand user concerns pretty well. But there is one point Mort has made that rubs against the grain of what Computerworld and MIS are all about. In a recent column in these pages that otherwise was completely on the money, Mort pleaded with the user community not to become the technology gurus but instead to rely on analysts, vendors and resellers to unearth the secrets of future technology. That's plain silly. 
Resellers care about money, vendors too often develop products in a vacuum and analysts often astound the world with their lack of insight. On the other hand, MIS actually uses the technology and knows better than any other group what the future should be. By Douglas Barney; Barney is a Computerworld senior editor, microcomputing. <<<>>> Title : Patents put on CD-ROM Author : CW Staff Source : CW Comm FileName: cdisc Date : Jan 23, 1989 Text: This month, the U.S. Patent and Trademark Office (PTO) will inaugurate a program using compact disk/read-only memory technology that promises to save taxpayers in excess of $300,000 each year. Previously, the PTO had been using an on-line patent information data service to disseminate abstracts of patents, files on corporations holding patents and other information to its nationwide network of 62 patent depository libraries. With the cost of the Classification and Search Support Information System (CASSIS) on-line service rising at an annual rate of more than 20% in recent years, the on-line costs were cutting deeply into the PTO's budgets. The costs had become so prohibitive that the PTO was forced to stop expanding its network of libraries in 1985, explained William Lawson, director of documentation at the PTO. Last year, the PTO decided to explore whether CD-ROM technology could act as a substitute for some portions of CASSIS and reduce CASSIS on-line costs. During a period of several months, 10 patent libraries field-tested computer hardware and a CD-ROM disk containing a five million-record CASSIS database, the full text of the 120,000 titles in the U.S. Patent Classification Manual and a file of approximately 100,000 corporations holding patents. The test disks were mastered by Dataware Technologies, Inc., a Cambridge, Mass.-based developer of authoring and retrieval software for use with CD-ROM applications and services for CD-ROM publishers. 
Within three months, the test sites saved more than $175,000 on their on-line costs, strongly suggesting that the CD-ROM information retrieval service could entirely replace the present on-line system, Lawson said. With the successful completion of the test program at the 10 patent libraries, the PTO now plans to install CD-ROM systems in all of its libraries by the end of the year, Lawson said. The government office also intends to issue two other test disks with customized development and text-retrieval software and then begin regularly updating the disks and publishing them every two months. More than money Beyond the dollar savings, projected to be $300,000 per year beginning in 1989, the CD-ROM information retrieval programs will permit the PTO to resume expanding its network of patent libraries at a relatively low fixed cost and to open libraries in areas of the country that are not well-served, including 10 states that do not have patent libraries, Lawson said. The PTO will use Dataware Technologies' CD Author, a development system, and CD Answer to retrieve, manage and produce information from CD-ROM disks. The software runs on an IBM Personal Computer AT or compatible with 512K bytes of random-access memory and a 20M-byte hard disk drive running Microsoft Corp.'s MS-DOS 3.0 or higher. Dataware Technologies, however, recommends that it be run on an Intel Corp. 80386-based machine with as much as 4M bytes of RAM and a hard disk with four times the capacity of the expected database volume. By Michael Alexander, CW staff <<<>>> Title : What makes the grade in r Author : CW Staff Source : CW Comm FileName: 116revue Date : Jan 23, 1989 Text: Remote communications software resembles traditional communications programs in that it handles standard telecommunications tasks. In addition, it gives you the ability to control another computer remotely as if it were your own. 
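The control loop all of the reviewed packages implement can be reduced to a toy model: the remote side sends keystrokes over the phone line, the host feeds them to its own applications and sends the resulting screen back. A minimal sketch of that idea (every class and method name here is invented for illustration; the actual products run over modem links with their own protocols):

```python
# Toy model of remote-control software: the remote sends keystrokes,
# the host applies them locally and returns its updated "screen".
# All class and method names here are invented for illustration.

class Host:
    """Stands in for the host PC running the shared application."""
    def __init__(self):
        self.screen = ""

    def handle_keystroke(self, key):
        # A real host feeds the key to the running application and
        # captures the video buffer; here we just append to a string.
        self.screen += key
        return self.screen

class Remote:
    """Stands in for the caller controlling the host from afar."""
    def __init__(self, host):
        self.host = host      # a real remote talks to the host over a modem
        self.display = ""

    def type(self, text):
        for key in text:
            # Each keystroke goes out; each screen update comes back,
            # so the remote operator sees exactly what the host sees.
            self.display = self.host.handle_keystroke(key)

host = Host()
remote = Remote(host)
remote.type("dir")
print(remote.display)   # the remote's display mirrors the host screen: "dir"
```

Everything the review measures hangs off this loop: speed is how fast screen updates travel back, compatibility is whether the host can capture what odd video adapters draw, and security is who is allowed into the loop at all.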
The five packages in this review and comparison share a number of features, including phone directories, chat windows that allow interactive on-line conversations between the remote and host users, remote printing, file transfer and security. Meridian Technology's Carbon Copy Plus Version 5.0 In addition to remote communications functions, Meridian Technology, Inc.'s Carbon Copy Plus Version 5.0 also provides a complete, full-function terminal emulation program. Features: Carbon Copy Plus contains a host program, CC, which runs all shared applications, and a remote program, CCHelp. The host and remote programs included in one Carbon Copy package cannot be used together. Carbon Copy Plus' powerful terminal emulator allows you to connect with mainframes, on-line databases and bulletin boards. Another nice touch is a script language, which automates recurring communication activities and supports Kermit, Xmodem and Ymodem file transfers. A compiler checks script syntax and creates a machine-language module that quickly executes scripts. Performance: Satisfactory to very good. In our standard benchmark tests, Carbon Copy Plus earns a satisfactory score for speed. The wide range of compatible modes and the cross-mode compatibility earn the product a score of very good in compatibility. It is the only program in this review that was capable of working with IBM's Enhanced Graphics Adapter (EGA) on one machine and Color Graphics Adapter (CGA) on another. However, we did run into a few problems. Security features include a password table that holds valid passwords and phone numbers for fielding incoming calls to the host. An option is available to stipulate the number of password attempts allowed. Another feature enables you to grant varying degrees of host access. Carbon Copy Plus' logging facility will track sessions according to such specifications as Calls Completed, or it will run a continuous audit trail. 
The Session-Capture feature records an entire session. Documentation: Good. The manual includes an index, a table of contents, a glossary, a troubleshooting appendix and a quick-reference guide. We stumbled around several times looking for words in the index, and a few references directed us to the wrong page number. Ease of learning: Satisfactory. Carbon Copy Plus takes more time to learn than the other programs in this comparison. We had to struggle with the print redirection, and installation involves a lot of steps. Ease of use: Good. Once you have mastered the vocabulary and set your initial parameters, Carbon Copy Plus is not very mysterious to use. Error handling: Good. The one lockup problem we had was correctable, and all other error-handling capabilities proved to be adequate. Support: Good to very good. Carbon Copy Plus comes with a 90-day media warranty and a bulletin board that provides patches, limited information and message collecting. Technical support hours are from 9 a.m. to 6 p.m. in both the Eastern and Pacific time zones. We found Meridian's technical support staff to be courteous and eager to help. Value: Good. Each package sells for $195; two packages are required. Although it is not as easy to learn as other products in this review, it is a logical choice if you want an all-in-one communications/remote program. Norton-Lambert's Close-Up Version 3.00A Norton-Lambert Corp.'s latest release of Close-Up Version 3.00A is especially useful as a remote support tool. Features: Close-Up is sold in two parts: Customer/Terminal for the host computer and Support/ACS for the remote system. The Automated Communications System (ACS) is a simple but powerful script language that lets you program the support program to perform unattended tasks. Close-Up provides a traditional terminal emulator that lets you communicate with any other computer. Its special feature is its capability to run in the background. Performance: Good to very good. 
Although Close-Up is a little slower at most things than the other programs in this comparison, it earned a good rating in our speed tests. As far as compatibility, Close-Up ran everything we attempted without a hitch, including graphics. You can set up passwords with varying degrees of file-transfer rights. You can also set up a callback routine for any of the passwords and even specify commands or applications to run automatically once connected. During remote operations, you can disable the host screen, keyboard, or both. Close-Up keeps normal session logs under whatever file name you want. If you specify the name of an existing log file, the new information is appended to the end of that file. ACS keeps an additional log for script files. You can also videotape your sessions and take snapshots of the customer's individual screens. Documentation: Satisfactory. Each module comes with a softcover, ring-bound manual. In place of a tutorial, there are several step-by-step procedural outlines. The table of contents is impressive, but the index is weak. There is no on-line Help. Ease of learning: Very good. Unless you are a complete novice, you will be able to use Close-Up almost as soon as you get it out of the box. Ease of use: Very good. Close-Up's Lotus Development Corp.-style menus and pull-down submenus lead you through hardware configuration and basic program operation. Error handling: Good. When we tried to force errors, Close-Up provided the appropriate error messages. We had trouble with our EGA and CGA display mismatch. Although the program never locked up on us, the CGA caller was left staring at a blank screen. Support: Satisfactory to very good. Norton-Lambert provides a 90-day media-only warranty. Technical support is available via a toll-free number Monday through Friday between 7 a.m. and 4 p.m. Pacific time. When we called technical support, the technician was immediately available on four of five calls. Value: Good. 
Priced at $245 for the host and $195 for the remote, this is the most expensive package in this group. However, it does offer extensive features and reliable operation. Co-Compute Version 2.14 Harmony Technology Associates' Co-Compute was designed primarily for collaboration, training and conferencing. Features: In addition to the standard host/terminal operating mode, Co-Compute offers a symmetric mode in which only keystrokes are transmitted over the phone lines. To do this, the same application is run on both machines in synchronization. This speeds things up by eliminating the transfer of entire images. Performance: Satisfactory to excellent. We performed our tests for Co-Compute in its symmetric mode. All of our standard tests responded as if we were sitting at the local machine, earning Co-Compute excellent marks for speed. You should be able to do everything you can do on your computer while in Co-Compute's symmetric mode, as long as your partner's computer is equally equipped. We could not get a screen-image capture with Imcap, a screen utility used to record software screens, while the conferencing menus were displayed. We were also unable to load Aldus Corp.'s Pagemaker remotely or work with Microsoft Corp.'s Word in either text or graphics mode. Co-Compute rates satisfactory for compatibility. Although Co-Compute does not support any identifiable security features per se, all remote functions must begin with a voice call. Session logging and recording features are not supported. Documentation: Satisfactory. The manual does a fine job of describing the Co-Compute features but lacks an organized summary of how you make them work. The manual also lacks screen shots and enough illustrations to support the text. Ease of learning: Satisfactory. Co-Compute takes a while to learn, mostly because of the symmetric mode. Installation is straightforward. Ease of use: Good. 
Once learned, using Co-Compute is quick and easy, although keeping in sync in symmetric mode takes some getting used to. Error handling: Very good. We did not lock up while using the program. Co-Compute is relatively foolproof. Support: Good to excellent. Harmony offers a 90-day media warranty with a toll-free support number. Support hours are from 8 a.m. to 6 p.m. Pacific time, Monday through Friday. We called technical support a couple of times when no one was available, but our calls were promptly returned. The staff answered our questions clearly and accurately. Value: Good. The package sells for $123.75 per module. Two modules are required. We do not think it is the program of choice for support applications, but we like it for collaboration or training. DMA's PC Anywhere III Release 3.00 DMA, Inc.'s PC Anywhere offers all the features of a first-rate remote-control program. Features: The PC Anywhere package contains separate program disks for the host and remote computer; Anywhere runs on the host system and Aterm runs on the remote system. However, you can use almost any communications program on the remote computer. In fact, the remote computer does not even have to be an IBM Personal Computer compatible. PC Anywhere has built-in support for more than 30 terminals. The automatic mode lets Anywhere sit in the background and wait for calls, but it leaves you without direct control over whether a call will be answered. The resident mode puts Anywhere in the background on standby, waiting for you to press the hot keys to access the menu. The menu lets you direct Anywhere to wait for a call, initiate a call or put Anywhere into the automatic, resident or disabled mode. Performance: Good to very good. In our standard benchmark tests, PC Anywhere was on par with most of the other remote products, earning a good rating for speed. We tested the package with a variety of software with little conflict, earning another good rating for compatibility. 
PC Anywhere's security functions let you designate a master password, up to 64 individual passwords or no passwords at all. You can also specify commands to be executed as soon as a caller is connected. The session-logging routine records who called, the session start and stop times and the operator's comments. You can prepare formatted activity reports and billing reports and print them to a printer, screen or files. It also includes an archiving function. Aterm's session-recording function lets you take snapshots of screens. Documentation: Good. PC Anywhere's manual is professionally prepared and well laid out. The table of contents and index are complete, but there is no tutorial. On-line Help is limited. Ease of learning: Very good. Most procedures are intuitive or clearly prompted. Users with basic computer knowledge should be up and running in a few hours. Ease of use: Very good. Most operations are intuitive and clearly prompted. Error handling: Satisfactory. In most cases, PC Anywhere handled errors smoothly. However, we did encounter trouble with the voice-to-data switching at 2,400 bit/sec. and with our IBM EGA and CGA display mismatch. Support: Poor to satisfactory. DMA licenses PC Anywhere on an as-is basis without a warranty. It offers technical support Monday through Friday, 9 a.m. to 5 p.m. Eastern time via a toll-free call. We had difficulty reaching technical support by phone. When we did, the assistance was fairly helpful. Value: Good. If you need a program to run your own computer remotely or provide support to a single client, the $145 PC Anywhere package is as good as any. Crosstalk Communications' Remote 2 Version 1.00 Crosstalk Communications Co.'s Remote 2 is one of those all-too-rare packages that feels right while you are using it. Features: Crosstalk markets Remote 2 as a two-module set, including R2Host for the host users and R2Call for the remote user. You can buy the modules separately. 
You can also use Crosstalk Mk.4 or Crosstalk XVI on the remote computer with minimal loss in capability. If you do not have either of these programs, you can use a terminal emulation program _ or even just a simple dumb terminal. R2Host runs in one of three modes. The Always Ready mode puts R2Host in the background to answer incoming calls automatically. The Manual mode is similar, except that the host operator must direct R2Host to answer calls. The Restart mode runs in the foreground and turns the host computer into a dedicated host, unavailable for local use. Remote 2 lacks voice/data switching, session recording and a call-out capability for the host. Performance: Satisfactory to very good. Remote 2 is slower than the others at updating text displays. However, it paints graphics screens faster than any of the rest. We rate Remote 2's speed as good. Imcap, a screen utility used to take pictures of software screens, was unable to capture on-line R2Host messages, earning a satisfactory rating for compatibility. Security options let you make Remote 2's host as secure or as open as you like. Session logging is less detailed for Remote 2 than for the other products in this review. R2Call logs outgoing calls, recording the host name, time, date, duration and comments. R2Host logs each caller's user ID, time, date, duration of call and unsuccessful logon attempts. Session recording is not provided. Documentation: Very good. Remote 2 comes with two well-written ring-bound manuals. They include detailed tables of contents and indexes. Clear instructions lead you through each function, explaining processes fully. In addition, Remote 2 has on-line context-sensitive Help. Ease of learning: Excellent. Users with a moderate level of computer communication experience should be running their first successful on-line sessions within a few hours. Installation is quick and easy. Ease of use: Excellent. We were impressed with how well things work and fit together. 
Remote 2 does not require you to remember much more than a couple of hot keys. Error handling: Excellent. Remote 2 either handled our errors without a problem or, in the case of mismatched bit/sec. rates, would not establish the link. Remote 2 uses a special error-correction routine to eliminate noise on bad phone lines, which can be disabled if the line is error-corrected. Support: Poor to satisfactory. Remote 2 is sold without a warranty. Free technical support is provided for registered owners, Monday through Friday, from 9 a.m. to 6 p.m. Eastern time via a toll-free call. The technical support number was busy nine out of the 10 times we tried to call during a period of three days. Value: Excellent. Remote 2 sells for $195 for the host and remote package, $129 for the host only or $89 for the remote only. We miss the capability to switch between data and voice, record sessions and originate calls from R2Host but are still impressed with the package. <<<>>> Title : CASE tool facilitates Win Author : CW Staff Source : CW Comm FileName: casew Date : Jan 23, 1989 Text: Personal computer programmers writing for Microsoft Corp.'s increasingly popular Windows interface now have a tool to help surmount the development hurdle. The graphical nature of Windows adds complexity to applications programming, a vexing situation for those used to character-based interfaces. CASE:W from Caseworks, Inc. in Atlanta, Ga., acts as a bridge, easing the transition of today's applications to the graphical environment. The $1,495 CASE:W Version 1.0 helps generate code that is specific to the Windows environment. With CASE:W, programmers still write the code for the core functions of an application; after that, the Caseworks tool generates code to create menus, windows and dialog boxes and to provide memory management. This has been the most difficult portion for many programmers used to older, character-oriented environments. 
The product is created from Windows code sets and production rules and includes a complete programming environment that generates concise, pretested code, officials said. It also includes a front-end prototyper that gives the user a way to describe the application program's windows and controls. The prototype window program specifies for the user the essential characteristics for the program's main window and menu system. CASE:W then automatically generates the program code files and produces the files necessary to develop the remainder of the program. Other features CASE:W reportedly supports many of the Windows controls such as menu bars, pop-up menus and dialog boxes. The product's design modes can extend to interface with dialog boxes not linked by the program's menu system. CASE:W also generates the Windows routines to process the dialog boxes. CASE:W runs on Intel Corp. 80286- or 80386-based machines with at least 640K bytes of memory. A hard disk drive is highly recommended. It also requires the Microsoft Windows Software Development Kit, C Compiler Version 5.0 or higher, Make Utility and Linker and a text editor compatible with DOS or Windows. By William Brandel, CW staff <<<>>> Title : A little too careful? Author : Clare Fleig Source : CW Comm FileName: fleigcol Date : Jan 23, 1989 Text: In the wake of the Internet virus plague, information systems managers are rushing to improve network security, and vendors are in an equal hurry to provide new and more sophisticated tools for doing so. This could lead to trouble, because while everyone is concentrating on plugging leaks, they may neglect to balance the need for security against the user's need for information and productivity levels. In situations like the Arpanet attack, companies and MIS managers in charge of communications often respond like the farmer whose horse was stolen _ by locking the barn door after the thief had struck _ with security reviews and added precautions. 
But in their haste to guard against trouble, IS departments can create such a maze of security that it discourages users from taking advantage of the network. If users are faced with numerous barriers to access and numerous passwords to memorize, they may respond by simply not using the network, defeating its raison d'etre. This is particularly true of casual users who do not have a pressing need to access network resources and top executives who are only too ready to delegate onerous tasks to subordinates. Thus, we have the need for a balance between security and accessibility. ``You obviously can't allow every user to have free run of the system, but at the same time, you want a network that users feel comfortable with,'' says a DP manager at a San Francisco-based travel agency. Achieving that balance involves weighing a number of factors, including the size of the organization, the sensitivity of the data on the network, the sophistication of the user community and the number and types of networks and users on each system. It also requires taking a whole new look at security and how it is handled in the organization. All too often, security measures are focused on the network itself, through the use of file- and system-locking on the server as well as a suite of passwords. But the physical network is only one component of security. Two other key factors often overlooked are the corporate security policy and educating the user community. ``A comprehensive security policy issued from the top can have a big effect on the use of the network,'' says the MIS director of a Chicago-based insurance agency. ``If users down the line see executives employing backup security _ whether it is for copying a disk or for a network _ they're more willing to listen.'' The creation of a comprehensive security plan starts with the MIS department. The first step involves resolving conflicts between servicing user needs and protecting the information base. 
The goal is a policy that achieves the maximum in user productivity while effectively protecting that information base. Next, MIS should develop a plan to handle day-to-day access operations as well as special circumstances such as planned break-ins, curious hackers and user access to the network from nontypical remote sites. When the policy is drafted, it should be reviewed and then disseminated from top management, so that it is given more weight by network users. Ideally, the security procedures should cover all company DP operations, including the use and copying of floppy disks, desktop computer use and network operations. Goes hand in hand A published security policy goes hand in hand with an education policy that familiarizes users with the need for security. ``When users understand why we need certain levels of security, they are more willing to adapt to it,'' says the Chicago MIS director. He also notes that the publicity surrounding visible computer virus attacks can be advantageous in reminding users about the importance of secure networks. Designing a security policy also means facing one indisputable fact: No network, regardless of the security precautions taken, will ever be fully secure. Any program that a person can devise, another person can eventually crack. Given that axiom, the best defense MIS managers have at their disposal is a three-pronged approach: a formal security policy that takes into account the individual trade-offs between security and network usage, an educated and informed user base and a secure physical network installation. By Clare Fleig; Fleig is director of systems research specializing in local-area networking and IBM communications for International Technology Group in Los Altos, Calif. <<<>>> Title : Joint effort opens E-mail Author : CW Staff Source : CW Comm FileName: aialan Date : Jan 23, 1989 Text: WASHINGTON, D.C. 
_ A pressing need to communicate electronically with business partners at a reasonable cost is what prompted one trade association's successful bid to get a prominent group of electronic mail providers to agree to support gateways to competitive mail services. The Aerospace Industry Association (AIA) recently met with eight such service providers to hammer out interconnection requirements for participation in an AIA pilot network slated to kick off Feb. 1. The AIA is a trade organization representing manufacturers of commercial, military and business aircraft and related equipment and components. The pilot will involve a select number of AIA members including Hughes Aircraft, Boeing Co. and Northrop Corp. The test will be set up so that each user is connected to a different mail system, with some attached to more than one service. Initially, the pilot will focus strictly on enabling AIA members to send messages _ as opposed to electronic data interchange documents _ to each other and outside trading partners via private and commercial E-mail systems, according to Peter Donaghy, who represented Hughes Aircraft Co. The way to go ``Without interconnection, this process is difficult and may require subscription to multiple mail services,'' said Joseph Dauksyscq, AIA's associate director, operations service. This is precisely what many AIA members are forced to do today and what they hope to avoid in the future. Instead, AIA is asking its messaging suppliers to use the CCITT X.400 standard for interconnection and to provide gateways from proprietary mail systems to X.400 as well as security and quality of service parameters. That will do ``We are not mandating that they all hook to every other service _ just that they support X.400 in the same manner, which will provide the appropriate level of interoperability,'' Donaghy explained. ``We asked if they could have this capability in place by Feb. 
1, and none of them balked.'' Participating service providers included IBM's Information Network, GE Information Services, AT&T, Western Union Corp., Dialcom, MCI Communications Corp., McDonnell Douglas Computer Systems Co.'s Tymnet and Telenet Communications Corp. Two other suppliers were invited to participate but declined. These suppliers were chosen because they are the ones currently serving AIA members. By Patricia Keefe, CW staff <<<>>> Title : ISDN still rolling at ful Author : CW Staff Source : CW Comm FileName: isdnwrap Date : Jan 23, 1989 Text: Often derided as an acronym for ``I Still Don't Know,'' the market for ISDN, or Integrated Services Digital Network, picked up a head of steam in 1988, and that momentum has continued into the new year with a spate of announcements and reports. AT&T recently announced what company spokeswoman Daisy Ottman described as ``yet another ISDN building block:'' a Clear-Channel Capability that allows customers to use the full 64K bit/sec. bandwidth within each of the 24 data channels in its Accunet T1.5 service. This paves the way for links between Accunet and ISDN services, since ISDN B channels also carry clear 64K bit/sec. bandwidth. Currently, however, the main benefit of Clear-Channel Capability is increased utilization of available T1 bandwidth, Ottman said. At present, AT&T provides no link between Accunet T1.5 and international or domestic ISDN services. It does, however, provide an ISDN gateway to Accunet Switched Digital Service, which handles 64K bit/sec. traffic. Clear-Channel Capability is provided as an upgrade to the AT&T network and is available free of charge with new orders for interoffice channels on Accunet T1.5, according to AT&T. Chip maker Advanced Micro Devices, Inc. in Sunnyvale, Calif., has both reduced the price on its Am79C30A Digital Subscriber Controller, which is reportedly a highly integrated chip for ISDN terminals, and shipped the Amlink3, an ISDN software development kit that costs $40,000. 
Ongoing technology and yield improvements have sliced $5 off the chip price, which now lists at $24.50. For end users, the impact is that ISDN equipment and service costs will decline as component expenses are reduced, making them competitive with existing analog and non-ISDN digital offerings. Compaq Computer Corp.'s former subsidiary Telecompaq is said to be rising again as an ISDN software house. Advanced Connectivity Systems, Inc. in Richardson, Texas, which reportedly was founded by refugees from Compaq's now-defunct division, is said to be readying an ISDN-based personal computer software package that will combine electronic mail and voice connectivity features originally developed by Telecompaq, an ISDN user claimed. Release is scheduled for the first quarter of this year. Another ISDN consulting service has emerged, this time from CAP International, Inc. in Norwell, Mass., which has launched the ISDN Architect Strategic Planning Service for integrated networks. The service will focus on market forces, deployment and consumption issues, as well as worldwide ISDN trends. It will also address standards, regulatory impacts, competitive stances and consumer adoption criteria. Users seeking more information about ISDN can peruse either of two reports published by Information Gatekeepers, Inc. in Boston. The first is a report initially prepared for the National Aeronautics and Space Administration by the University of Colorado Center for Space and Geosciences Policy. ``NASA and the Challenges of ISDN: The Role of Satellites in an ISDN World'' is now available through Information Gatekeepers for $75. The study outlines ISDN concepts and lists key organizations, the current status of key standard recommendations and domestic and international ISDN implementation progress. 
It also suggests that ``NASA could work with other space agencies to ensure a coherent posture with regard to the role of satellites in ISDN.'' A second effort, ``ISDN Applications,'' lists and describes more than 70 applications, including some taken from field trials and actual applications used by Southwestern Bell Corp., AT&T, GTE Communications Systems, Ameritech, Bell Canada and US West. The tome includes about 30 applications identified by the national ISDN Users Forum, which, coincidentally, is meeting tomorrow through Thursday in Gaithersburg, Md. By Patricia Keefe and Elisabeth Horwitt, CW staff <<<>>> Title : NCR Comten launches SNA s Author : CW Staff Source : CW Comm FileName: comten1 Date : Jan 23, 1989 Text: ST. PAUL, Minn. _ NCR Comten, Inc. has released yet another product salvo against IBM in its ongoing battle for market share in the Systems Network Architecture front-end processor arena. The NCR Corp. subsidiary announced upgrades that are said to provide the Comten 5660 with more power and greater backup flexibility than a comparable configuration of IBM's high-end communications processor, the 3745. The new High Performance Feature is said to provide Comten 5660s with 50% more throughput, in terms of transactions per second, than comparably configured 5660 models without the enhancement. Comten 5660s equipped with the feature have 80% more throughput than a comparably configured 3745 Model 210, according to a Comten benchmark test. The Model 210 is IBM's single-processor version; Comten gave no comparison with IBM's dual-processor 3745 models. The feature is priced at $60,000 for new systems and $75,000 for an upgrade. Comten also announced that it was cutting the 5660's base price from $300,000 to $175,000. The High Performance Feature allowed Donovan Data Systems, Inc. 
in New York to increase throughput, connect more lines and add more applications without having to add another front-end processor _ which would be expensive in terms of both hardware and support dollars, according to Ronald Block, database service vice-president. Since the enhancement was added last summer, utilization of Donovan Data's Comten 5660 has dropped from 90% to 50%, he said. Comten also announced a new version of the Universal Communications Adapter (UCA), which reportedly provides more flexibility than older models in terms of backup and data-routing configurations. The UCA acts as a multiplexer for the 5660, handling incoming lines from modems, protocol converters and other communications devices. While the older adapter could only interface with one primary processor and one backup system, the new model can handle up to two backup systems and two active systems, Comten said. Unlike IBM's backup system, which uses a second CPU for full redundancy, Comten's UCA allows the user to choose whether to have partial or full redundancy of processors and adapters, according to Comten Vice-President of Development Ron Groenke. The product is scheduled for delivery in the second quarter, with prices beginning at $21,000. Comten's recent surge of enhancements, which has included an IBM Token-Ring interface, is necessary if the company wants to combat IBM successfully at the high end of the communications processor market, said Josh Gonze, a consultant at market research firm International Data Corp. in Framingham, Mass. ``Comten has taken its top of the line and switched in more powerful components, which is the way upgrades should be done,'' he said. A third Comten announcement, the 16-Line Communications Base, is said to handle up to eight 56K bit/sec. lines, compared with the older board's four-line limit. Second quarter delivery is planned at a base price of $7,035, according to the vendor. 
By Elisabeth Horwitt, CW staff <<<>>> Title : Big guys help Apollo boos Author : CW Staff Source : CW Comm FileName: ncsdec Date : Jan 23, 1989 Text: CHELMSFORD, Mass. _ Apollo Computer, Inc. has been steadily gaining ground in its efforts to make Network Computing System (NCS) an industry standard. Last May, the workstation vendor proposed NCS as part of the Open Software Foundation's (OSF) planned open software environment. Since then, IBM, Hewlett-Packard Co. and, most recently, Digital Equipment Corp. have licensed the software from Apollo. Software tools within NCS allow the user to ``take a single application program, split it up, and have the parts executed on different machines, then brought back together,'' DEC spokesman Peter Kobs said. Different segments of an application can communicate or trade data via remote procedure calls. One of the features that differentiates the product from its competitors is a ``location broker'' residing on one system that keeps track of the location of various software and data elements, according to Apollo senior product manager Saul Marcus. DEC has not committed itself to definite product plans for NCS. However, ``I don't think DEC has any choice'' about incorporating NCS into its Ultrix systems, said Bruce Richardson, vice-president at Advanced Manufacturing Research, Inc. in Cambridge, Mass. Given that the Unix environment is ``inherently multivendor, if DEC wants to be considered on par with Apollo, Sun and other Unix workstations, it needs to incorporate leading server technologies,'' he added. ``A lot of users won't take VAX seriously as a [Unix] server because it is too much of a DEC solution,'' Richardson said. While Ultrix systems can exchange files with other types of workstations via Open Systems Interconnect or Transmission Control Protocol/Internet Protocol, they currently have no way to share applications, he added. 
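The location-broker idea described above _ a registry that tracks where the pieces of a distributed application live, so a caller need not know which machine actually runs a procedure _ can be sketched in a few lines. This is an illustrative sketch only, not Apollo's actual NCS interface; all names here are hypothetical.

```python
# Illustrative sketch of a "location broker": a registry mapping
# service names to the hosts where they run. Servers register their
# services; clients look up a service before issuing a remote
# procedure call. Hypothetical names, not the real NCS API.

class LocationBroker:
    def __init__(self):
        self._registry = {}  # service name -> (host, port)

    def register(self, service, host, port):
        """A server announces where a service can be reached."""
        self._registry[service] = (host, port)

    def lookup(self, service):
        """A client asks where to direct a remote procedure call."""
        if service not in self._registry:
            raise KeyError(f"no host advertises {service!r}")
        return self._registry[service]

broker = LocationBroker()
broker.register("matrix_solve", "compute-node-3", 5000)
host, port = broker.lookup("matrix_solve")
```

Because the broker alone knows the mapping, a service can move to another machine without any change to its callers, which is the point of the design.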
While DEC does offer distributed processing on its systems, NCS ``is a horse of a different color,'' Kobs said. DEC, he added, has ``symmetrical multiprocessing on a single system'' and some job-sharing on Vaxclusters, but it lacks the distributed functionality offered by NCS. Backing by leading computer vendors should help NCS win acceptance into the OSF environment, Richardson said. Particularly significant are IBM's plans to integrate NCS into its Unix-like AIX operating system, because AIX is already slated to be part of the open software environment. Is it enough? Still in question, however, is whether NCS' growing stature as an industry standard will help Apollo catch up with archrival Sun Microsystems, Inc. in the workstation market, Richardson said. Sun has not produced a successful competitor to NCS in the distributed applications arena. However, Sun's Network File System (NFS), supported by dozens of vendors, has been hailed as an industry standard for several years. Last week, Symbolics, Inc. announced Symbolics-NFS, its own version of Sun's protocol, which will allow Symbolics' artificial intelligence workstations to communicate with a wide variety of workstations and operating systems, the vendor said. Symbolics-NFS is priced at $1,000. Both products are available now for the Symbolics 3600 family of workstations. By Elisabeth Horwitt, CW staff <<<>>> Title : A high-speed data communi Author : CW Staff Source : CW Comm FileName: netcompm Date : Jan 23, 1989 Text: A high-speed data communications controller has been introduced by Computer Modules, Inc. Designated the LSPC Serial/2, the unit is reported to be a dual-channel, multiprotocol, asynchronous/synchronous serial interface designed for the IBM Personal Computer AT bus. The controller has a maximum data rate of 400K bit/sec. and incorporates a 3-byte receive buffer on each channel. The LSPC Serial/2 costs $245. Computer Modules, 2348C Walsh Ave., Santa Clara, Calif. 95051. 408-496-1881. 
<<<>>> Title : Cabletron Systems, Inc. h Author : CW Staff Source : CW Comm FileName: netcable Date : Jan 23, 1989 Text: Cabletron Systems, Inc. has announced an Extended Media Adapter designed for integrating installed thin-wire devices and network segments into twisted-pair Ethernet local-area networks. According to the vendor, the product allows users of personal computer LANs, Digital Equipment Corp. workstations and other devices with Ethernet ports to utilize the star topology, wire management and centralized network control offered by twisted-pair LANs. The Extended Media Adapter costs $550. Cabletron, P.O. Box 6257, Rochester, N.H. 03867. 603-332-9400. <<<>>> Title : Knowledge Network Systems Author : CW Staff Source : CW Comm FileName: netknowl Date : Jan 23, 1989 Text: Knowledge Network Systems, Inc. (KNS) has announced an office automation platform for Novell, Inc.'s personal computer local-area networks. Called Oasis, the system reportedly protects against several types of security breaches, including computer viruses, by providing a controlled environment at the corporate management level. It also includes an interpretive language facility for application development. Oasis is priced at $2,995 per server. The number of workstations that can be incorporated is determined by the type of LAN. KNS, Suite 1800, 3800 Concord Pkwy., Chantilly, Va. 22021. 703-968-0378. <<<>>> Title : Kortek, Inc. has introduc Author : CW Staff Source : CW Comm FileName: netkorte Date : Jan 23, 1989 Text: Kortek, Inc. has introduced a remote communications software package with Microcom Networking Protocol error-checking features for the IBM Personal Computer and Personal System/2 market. Called Freeway Remote, the software offers a debugger and programmable keyboard control functions. Other features include a voice-to-data switch and a graphics support file. Freeway Remote costs $249.95. Kortek, Suite 302, 460 California Ave., Palo Alto, Calif. 94306. 415-327-4555. 
<<<>>> Title : Best Data Products, Inc. Author : CW Staff Source : CW Comm FileName: netbestd Date : Jan 23, 1989 Text: Best Data Products, Inc. has entered the Apple Computer, Inc. marketplace with the introduction of the Smart One 2400XMAC external modem. The Macintosh-compatible device offers data transmission rates at 2,400 bit/sec. and includes autoanswer and autodial capabilities, the vendor said. It is priced at $279. Best Data, 5907 Noble Ave., Van Nuys, Calif. 91411. 818-786-2884. <<<>>> Title : Halley Systems, Inc. has Author : CW Staff Source : CW Comm FileName: nethalle Date : Jan 23, 1989 Text: Halley Systems, Inc. has introduced a T1 broadband modem developed for high-speed data links. The Z2000 reportedly provides protocol-transparent, serial data transfers at the T1 rate and is said to be ideal for any application that requires high-speed point-to-point data transmission, including workstation-to-mainframe and private branch exchange links. The Z2000 costs $3,950. Halley Systems, 281 Orchard Pkwy., San Jose, Calif. 95134. 408-434-3500. <<<>>> Title : Two IC/Ps set new pace Author : David Hudson Source : CW Comm FileName: hudsid Date : Jan 23, 1989 Text: Large-scale in-house publishing is made possible by cost-effective high-speed _ 80 page/min. and above _ nonimpact printers. These peripherals are often referred to as centralized intelligent copier/ printers (IC/P). For publishing applications, these devices must be graphics capable and accept cut-sheet paper as opposed to data processing fanfold stock. Duplex, or two-sided, imaging is also required. Last year, both IBM and Xerox Corp. introduced IC/Ps targeted at the high-volume in-house publishing market. In February 1988, IBM rolled out the 3827, the company's first high-speed cut-sheet printer. The copier provides print speeds of up to 92 image/min. with a resolution of 240 by 240 dot/in. List price is $193,000. 
Seven months later, Xerox, the historical market leader in high-speed cut-sheet printing, introduced the 4090. The 4090 also prints at speeds up to 92 image/min. but has a resolution of 300 by 300 dot/in. List price is $190,000 for an IBM 370 channel-attached configuration. Although the two printers are positioned closely in speed and price, they take different architectural approaches to resource and document management. IBM's 3827 is based on an LED print engine and incorporates an IBM Advanced Function Printing (AFP) controller. AFP is an umbrella term for IBM's Systems Application Architecture (SAA) printing architecture. Its significance, like SAA's, lies in IBM's move to standardize various protocols across operating systems and hardware platforms. With the 3827, IBM hopes to move its customer base from widely varying printer data streams to the page description language-like AFP formats. By introducing advanced printers with AFP-only controllers, IBM is prodding its customer base to develop new applications, possibly for publishing, and migrate older applications to the new standards. Architecturally, the 3827 has a symbiotic relationship with its driving host. Fonts, forms, graphic images and format command files are stored in AFP software resource libraries on the host. Document data streams are routed to the AFP software driver, the Print Services Facility, which manages resource allocations to the 3827 and converts the data stream to one that is specifically bound to the receiving printer. A two-way printer-host dialogue allows the host to establish the present resource status of the printer to avoid redundant downloading. IBM's host-based approach to printer intelligence offers the traditional advantages of centralized management. Printing resources such as forms and fonts can be directed to any compatible (read AFP) printer on the network. Maintaining a single resource copy also makes version management simple. 
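The two-way printer-host dialogue described above amounts to a cache check: before downloading a font or form, the host asks the printer which resources it already holds and sends only what is missing. A minimal sketch of that negotiation follows; the names are hypothetical and this is not IBM's actual Print Services Facility protocol.

```python
# Sketch of the host side of the resource-status dialogue: given the
# resources a document needs and the list the printer reports as
# already resident, compute what must still be downloaded.
# Hypothetical resource names; illustrative only.

def resources_to_download(required, printer_resident):
    """Return the required resources the printer lacks, in order."""
    resident = set(printer_resident)
    return [r for r in required if r not in resident]

required = ["font:Sonoran-10", "form:invoice-v2", "image:logo"]
resident = ["font:Sonoran-10", "image:logo"]
print(resources_to_download(required, resident))  # ['form:invoice-v2']
```

Skipping the resident resources is what spares the channel the repeated transfer of large font and image files on every job.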
Disadvantages of the 3827 include the need to download graphics-intensive resources at print time if they have not been cached, which ties up valuable communications bandwidth and slows throughput. The Xerox 4090 uses laser imaging and is based on Xerox's 1090 copier engine, which is coupled with a stand-alone controller unit housing a Digital Equipment Corp. J11 processor and hard-disk subsystems for resource storage. The 4090 attaches to an IBM host as an IBM 3211 protocol line printer. Xerox's approach to printer intelligence puts the processing power in the printer itself. With local hard-disk storage capacity of up to 1.1G bytes _ 370M bytes is standard _ the printer can store massive amounts of imaging resources such as scanned images, fonts and forms. Data files can also be buffered for printing multiple-document copies without data file retransmission from the host. Storing graphics-intensive resources such as font bit-maps locally enhances throughput at print time. The disadvantages of this approach are those typical of distributed processing in general. In a multiple-printer network, each printer must have a local copy on hard disk of each required resource. Managing multiple copies of resources is difficult. For example, when updating a stored form, the new version must be loaded to all the printers. Utilities that facilitate resource management also add overhead. Offering cut-sheet duplex printing, average monthly print volumes in excess of one million pages and optional postprocess paper handling, both machines are solid platforms for in-house publishing of high-quality documents. By David Hudson; Hudson is an analyst at CAP International, Inc.'s Intelligent Copier/Printer Market Requirements Service, which is located in Norwell, Mass. 
<<<>>> Title : Mixed-platform publishing Author : CW Staff Source : CW Comm FileName: drm Date : Jan 23, 1989 Text: Imagine for a moment that some of your text files reside on a personal computer network, some on a workstation and some on your mainframe. You also have graphics and data that you want to share among departments. File-transfer software allows you to move files from one environment to another, but how do you keep track of where each file is and when it was last updated? What you have been imagining is a mixed publishing environment, and it is the new challenge for systems managers. Mixed environments, especially those that involve networks, provide almost unlimited flexibility. That degree of freedom is a blessing in some ways for document resource management, but it can also be a curse. On the positive side, users can share resources produced by different departments, each of which uses a different computing system. End users can take advantage of the best features of each of the processor platforms while sharing the fruits of their labor across those platforms. For instance, host-based graphics tools are heavy resource consumers and lack many of the features of PC-based tools. On the other hand, host-based chart design facilities are usually more powerful than their micro-based counterparts. PC software typically offers better tools for improving the writing and readability of documents; comparable software is often difficult to find in the mainframe environment. Inexpensive proofing printers or high-quality laser printers are usually attached only to PCs, while high-speed production printers are typically attached to a mainframe. Managing document resources in this kind of environment requires very careful planning and monitoring in a variety of areas. Strategies must be mapped out for text creation, graphics, composition and target printers. File-transfer methods must be weighed and measured. 
Font compatibility across all printers must be arranged, and a method must be found for tracking where all files are at all times. The most important part of managing the environment is understanding what types of files can be created and how portable they may be within a mixed setting. A manager must be acutely aware of which products are being used on all platforms, what text file formats are supported and what type of printer-ready output each can produce. Even products made by the same vendor are not necessarily compatible across all platforms. For example, if you try to connect PC environments using Aldus Corp.'s Pagemaker or Xerox Corp.'s Ventura Publisher to other platforms, you will find that the file formats for these products are not completely portable. It is possible to produce Adobe Systems, Inc. Postscript files from these products, transfer them to the mainframe, run the Postscript through an IBM AFPDS interpreter and print on a host-attached Advanced Function Printing printer. But this route would have to be repeated each time a source file changed. Exactly the same path must be taken, by the way, if you use Interleaf, Inc.'s Interleaf Publisher interface, which permits files to be transferred to the host. It is even more difficult to integrate host-based text or raw data into the PC page makeup environment. You could use a product like IBM's Markup, which is considered to be a text-entry tool, to create text files on the PC that can be transferred to the mainframe and composed using IBM's Document Composition Facility (DCF). However, working across platforms usually means using a text editor _ a host-based product or a micro-based one such as Mansfield Software Group, Inc.'s Kedit or Phaser Systems, Inc.'s Micro/SPF _ to create raw text files that may contain composition language markups or format instructions that one would see on a printer-ready file. 
If users create text files on a stand-alone word processor or a host-based office system product, the problem of compatibility becomes even more acute. These files will not be directly compatible with pure text files created with a text editor on either a PC or a host. IBM's Office Document Facility can be used to transfer such files to a host system in a DCF-compatible format, but the files that end up on the host use a primitive set of DCF Generalized Markup Language; if they are modified in any way, they cannot easily be sent back to the originating office system platform. If graphics are used within text files, there are just as many formats and just as many extra steps with which to contend. Furthermore, regardless of their original format, all graphics must finally be translated into either raster images, which are rectangular arrays of dots represented by strings of binary numbers, or into a page description language such as Postscript. If a raster image is required, it must not be translated when it is transferred between the host and the PC, so careful control of the file-transfer method is vital. It is also important to know if the originating software platform can translate the file back to the original form if changes are made in a different environment. Other questions are involved in managing the composition process in a mixed environment. You must know whether you maintain printer-ready files and, if so, the type of printer to be used and where it is attached. You must know whether all the printers use the same data stream. And, if not, you must decide whether to use software to convert the data stream or recompose for each type of printer. Then, too, there are the complications that arise because fonts are printer-specific and fonts available for one printer may not be available for use with another. Fonts and printers vary in their resolutions. 
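The caution above about raster images is easy to enforce mechanically: a raster file is usable only if every byte survives the trip between platforms. As a minimal sketch (the helper name and choice of hash are illustrative, not from the article), one can checksum the file on the source platform and compare on the receiving side; any mismatch signals that the transfer software altered the bytes, for instance through an unwanted character-set translation.

```python
import hashlib

def file_digest(path):
    """Return a SHA-256 digest of a file's exact bytes (read in binary mode)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compute file_digest() before sending and again after transfer;
# equal digests mean the raster arrived bit-for-bit intact.
```

In practice the two digests would be computed on different machines; the point is simply that careful control of the file-transfer method can be verified, not just hoped for.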
When a document arrives at a printer, it has been transformed into a specific data stream with references to printer-specific fonts. The text has been set assuming a specific resolution. It is difficult to ensure that a document will look the same on both a host-based and a PC-based printer. Most IBM printers are 240 dot/in. devices, while most other printers are 300 dot/in. These resolution differences make it impossible for documents to look identical even if the same fonts have been provided for both. The difference in resolution adds up to different line and page breaks. The advent of Postscript as a de facto standard for many printer vendors may alleviate this problem to a considerable degree because Postscript's fonts are resolution independent. That still leaves the manager with the question of whether all the printers throughout the environment are Postscript compatible and whether the specified fonts are available. File-transfer software is, of course, required in a mixed environment. Commercial software can be used, but if you have the time to create your own, it is possible to build in tracking functions that will prove helpful when files are moved from one platform to another. There is little in the way of automated solutions to manage text files created on more than one platform. Most people start out by creating their own file management methodology by using naming conventions to identify the home location of a given file. Programs can be added later to warn a person changing a file in one place that the same file exists elsewhere at a higher revision level. Alternatively, some managers settle on a library-based system, which requires that a file be checked out from and returned to a specific platform. The only certain fact in this kind of situation is that managers of mixed-document-composition environments are forced to rely on their own ingenuity. 
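The naming-convention approach just described can be sketched in a few lines. The format `PLATFORM-document-rNNN.ext` is a hypothetical convention for illustration, not one prescribed by the article; the idea is only that once the home platform and revision level are encoded in the name, a small program can warn that the copy being edited is stale.

```python
def parse_name(filename):
    """Split a name like 'HOST-chap01-r003.txt' into (platform, doc, revision)."""
    stem = filename.rsplit(".", 1)[0]
    platform, doc, rev = stem.split("-")
    return platform, doc, int(rev.lstrip("r"))

def newer_revision_exists(local_copy, known_copies):
    """True if any known copy of the same document carries a higher revision."""
    _, doc, rev = parse_name(local_copy)
    return any(d == doc and r > rev
               for _, d, r in map(parse_name, known_copies))

# A PC copy at revision 2 is stale if the host already holds revision 3.
print(newer_revision_exists("PC-chap01-r002.txt",
                            ["HOST-chap01-r003.txt"]))  # prints True
```

The library-based alternative mentioned above replaces this after-the-fact warning with a hard check-out lock, trading convenience for certainty.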
By Patricia McGrew and William McDaniel; McGrew and McDaniel are co-authors of In-House Publishing in a Mainframe Environment and On-line Text Management in an IBM Mainframe Environment, both published by McGraw-Hill. <<<>>> Title : Suggested procedures Author : CW Staff Source : CW Comm FileName: drmbox Date : Jan 23, 1989 Text: Recommended steps for making document resource management work in a mixed environment are as follows: Review technical requirements for text entry, graphics, composition, printing and file transfer before you try sharing files across platforms. Make sure that file formats are compatible across all environments. Provide tools for transferring files between platforms that minimize the need for user intervention. Appoint a controller to oversee hardware and software use and acquisition. Develop internal tracking of source files by where they reside, what documents use them and who controls them. Determine whether it is better to maintain print-ready files to permit demand printing or to recompose all files on an as-needed basis. <<<>>> Title : Intersection with image p Author : CW Staff Source : CW Comm FileName: pubin Date : Jan 23, 1989 Text: Woodrow H. Vandever Jr. is executive vice-president at Interconsult, Inc., a Cambridge, Mass.-based research and consulting company that specializes in new information technologies, including those associated with electronic publishing. Vandever recently spoke with Computerworld Senior Editor Joanne Kelleher about the ways in which he sees the technologies of electronic publishing and image processing converging. You say that there eventually will be a merging of electronic publishing technology and image processing technology. Why do you think this is inevitable? I think it has to occur, because without it, you are not going to be able to take full advantage of electronic publishing. This evolutionary step is necessary because of the volumes of information that electronic publishing will generate. 
As this information begins to expand, you run into immense problems involving the storage of electronic databases, the management of versions and revisions of documents and the distribution of information throughout the corporation. When you look at it that way, the need for immense archivable volume storage is a natural outgrowth of this process. You need large amounts of storage, and you need efficient software access mechanisms for image, text and numeric databases, and then you need a way to disperse it throughout the corporation. Can you clarify that connection? Right now, the vendors of image processing systems are responding mostly to the needs of a customer base that was previously buying microfilm, and they save the information in raster format vs. some kind of a logical format. They don't know that documents of the future are going to be done electronically, and hence, the source material is already in some kind of an object orientation, such as characters or vectors. When you stop to think about it though, these systems are nothing more than large, archivable electronic storage devices for documents that the corporation or government agency uses. How do you see this affecting either the scope or the content of corporate electronic publishing? I think it affects the business more than it actually changes corporate publishing. It is a mutual rather than causal relationship. The major impacts will be that you will be able to optimize the use of information _ making it available to people in a way that is as easy as electronic mail. Is this important because corporate publishing is becoming a more strategic undertaking? Absolutely. In addition to accessibility, what you are talking about is turnaround time. You can interact with your customer more rapidly. You can get your document out. If you are a manufacturing corporation, one reason that corporate electronic publishing looks so nice is not just the hard cost savings but the soft cost savings. 
For example, most manufactured goods cannot be shipped without the documentation. But in all cases, the documents are always the last entity in the process because usually, the product keeps getting modified and changed until it is finally produced. Image processing systems are not set up right now to accommodate the kind of varied input and widespread exchange that you are talking about. What would have to happen in order to make them effective vehicles for corporatewide archivable electronic storage? First of all, there has to be a recognition that image processing is no longer a unique entity. Image processing is really an outgrowth of automating microfilm/microfiche, which were optomechanical means for reducing the volume of paper that people needed to save while still keeping the information it contained accessible. When image processing came along, people realized that they could scan information in and keep it electronically, ship it around the system and display it electronically. But they are still treating it as if it were hard copy, and the best compaction you can get, measured over the number of bits that you have to save per page, is probably around 80 to 1. And most people don't get anywhere near that. But effective compression of information would be a must to deal with the kind of volume you are talking about. How can image processing systems be adapted to accommodate a larger volume? One of the problems with image processing is the sheer volume of data that you have to carry around and store. If you have to save little rasters, you wind up, at a resolution of 300 by 300 dot/in., with 9 million bits, or roughly one megabyte of information, for an 8 1/2- by 11-in. page. People use different compaction schemes, but they tend to wind up having limits of about an 80-to-1 ratio for compression. And that still means 12,000 bytes for a single page. 
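The storage figures quoted here can be checked directly. The short calculation below assumes a bilevel scan (one bit per dot) of an 8 1/2- by 11-in. page at 300 by 300 dot/in., as in the interview:

```python
DPI = 300
WIDTH_IN, HEIGHT_IN = 8.5, 11

bits = (WIDTH_IN * DPI) * (HEIGHT_IN * DPI)   # one bit per dot
bytes_raw = bits / 8
bytes_compressed = bytes_raw / 80             # after 80-to-1 compression

print(f"{bits:,.0f} bits")             # 8,415,000 bits (roughly 9 million)
print(f"{bytes_raw:,.0f} bytes")       # about one megabyte, uncompressed
print(f"{bytes_compressed:,.0f} bytes")  # about 13,000 bytes per page
```

The numbers land close to the interview's figures; the quoted 12,000 bytes evidently rounds down from about 13,000.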
If, as information becomes electronic, you can save the things in a logical format, then you have a higher data compaction; you effectively have object recognition occurring. And that is invariant, small, compressed _ and you can save a lot more of it. On the output side, for example, the industry has pretty much settled on Postscript as the de facto output medium in the non-IBM mainframe world. And Postscript can actually be thought of as a compaction scheme because it not only contains the document format and textual information but also effectively contains large numbers of the images in kind of a vector or programming format. Image processing systems need to be able to take advantage of that kind of efficiency, capturing the large volumes of newly generated documents in either the output or interchange formats where they reside, because these formats are much more efficient in terms of compression. It is true that some image processing systems use CCITT Group IV, which is a very compressed format, but a lot have their own special formats. And that is a problem, not only in terms of compression but also compatibility, right? True. As interchange standards such as ODA/ODF [Office Document Architecture/Office Document Facility], SGML [Standard Generalized Markup Language] and CDA [Digital Equipment Corp.'s Compound Document Architecture] become available, then large numbers of documents in the system will want to be able to live in the interchange format, so that the information is reusable _ can be modified, changed, grabbed in sections, and so on. When that happens, people will also want to be able to disseminate that information throughout their companies, across the network, and use the standard terminals on their desk to see it. And that means image processing systems are going to have to become compatible with all the standards that drive the process. 
In the future, the only thing specialized about image processing will be the software-based capability to control, manipulate and manage large-volume text, image and numeric databases in an efficient fashion. The rest of the components will be standard. There will be some big database management problems that will have to be tackled, but image processing won't be the specialized problem domain that it has been in the past. What kind of database management problems are you talking about? The size of the textual streams and the size of the image streams are so big that you can't effectively get things back. It's a new game, and we'll have to develop a whole new set of paradigms to deal with it. This sounds like something that is logical and economical only at a certain level of need. I agree with you, but that is today. Tomorrow, I look at it differently. The reason I say that is that tomorrow, as information basically becomes electronic, then the need for high-volume archivable storage just becomes an automatic offshoot. I'm going to have these electronic databases in most corporations, whether I like it or not, just by the fact that I am going to do corporate publishing. So in the future, it is a question of volume. And the neat thing is when we do go this route, then the actual cost goes down because you won't have all the special hardware for format translation. You can use your network throughout your company and have things accessible. Are we talking about significant savings here? Enough to make cost a deciding factor? Well, I guess I think the dynamics are a little different. I think there will be cost savings, but the cost savings get down to the fact that volume storage is going to get cheaper and cheaper. Still, I can't imagine that every corporation would really have a need for such massive electronic document archives. Actually, that is true. There may be a class of people for whom that will not be necessary. 
There are several scanners capable of doing this now and a lot more in development. When would you expect that we might see some real merger between the two technologies? It is hard to say exactly when that will occur. It could occur tomorrow morning or in four or five years. All that I can say for sure is that it will occur. My guess, however, is that it will evolve, rather than happening suddenly and dramatically. Why do you say that? I've talked to a number of the people who are players in the image processing market, and I'm not sure they recognize the potential. Is the situation similar among corporate electronic publishing vendors? To a certain extent, yes. You have to separate the people who are selling composition and pagination systems from larger corporations such as DEC, IBM and Xerox. For the smaller ones, it is not so much a lack of vision as a lack of resources. When you go to IBM and DEC, on the other hand, or Sun and Apple, you run into a different situation. I believe they do see this as the way of the future. Do you think the real catalyst for what we've been talking about is going to be the read-write CD-ROM? Absolutely. That kind of thing will make it all feasible. <<<>>> Title : Ask the vendor Author : CW Staff Source : CW Comm FileName: askpub1 Date : Jan 23, 1989 Text: We use IBM's Solutionpac on an IBM Personal System/2 Model 70 with the IBM 4216 printer. It only allows us to run one PC into each printer. Does IBM have plans for an adapter that will allow us to run more PCs into the 4216 printer? Paul Olenski Director of MIS Chicago Transit Authority IBM: The 4216 laser printer may now be shared in a local-area network. This month, IBM will unveil a maintenance release of the IBM Personal Pageprinter Adapter Program, which will allow printer sharing by multiple PS/2s, Personal Computer ATs and XT 286s. <<<>>> Title : Pulling down the grapevin Author : Alan J. 
Ryan Source : CW Comm FileName: mgtcolum Date : Jan 23, 1989 Text: The California Raisins have become famous singing about what they've heard through the grapevine _ namely, that raisins grown in the California sunshine are better than any others. Those cute, animated clay figures singing and snapping their tiny gloved fingers are the epitome of television advertising. They bespeak sincerity. They charm children into demanding ``Raisins from California only, Mom.'' But picture this. Bring those plump raisins into the MIS department of Joe's Huge Nationwide Claymation Advertising Co. and give them all the information they need to know about doing their jobs. But neglect to pass along any information about important corporate issues that could affect them. Soon, the uninformed raisins will be considered little, wrinkled, dried-out grape gossipmongers, because as they start to hear tidbits of information, it is likely they will in turn pass it along. Tales of mergers, takeovers, promotions, demotions, new hires, layoffs, profits, losses, percentage raises, safety concerns and the like will be on the lips of every raisin venturing to the water cooler. Now, put the faces of your subordinates and coworkers on those raisins, and you can get an idea of how the lack of information can be more damaging than telling employees some news that might adversely or positively impact them in the future. Knowing that the uninformed can do a lot of damage through the rumor mill, especially when the rumors are unfounded, many companies are now fighting back. Negative rumors are like negative advertising for the company. At MCI Communications Corp. in Washington, D.C., for example, the firm tells of an ongoing active effort to shower the firm with accurate information straight from the top. Each Monday, executives meet to discuss the business, and that information is then passed along to the company's employees. 
According to Bill McGowan, founder of MCI, the executive-level breakfast meetings are the place where rumors are stopped before they can be started. The managers, in turn, are told to spread the word to the masses. Does it really work? McGowan says yes. ``When people are aware of what is happening elsewhere in the company, they can better make decisions in their own areas,'' he says. J. Raymond Caron, president of Cigna in Philadelphia, agrees. It is important to let employees know where the company is headed, he says, and how they fit into its direction: ``We also have to let them know when we are not there and why we are not there. But it works both ways. We need to allow for the employees to let us know what works and what isn't working.'' This certainly means more than sharing with employees the fact that the boss will be retiring in 1992. Strategic information should be shared. Employees will most likely perform better when they know what goals they are supposed to be striving toward. At the same time, care should be taken so that highly confidential information does not leak out. By being straightforward, a manager will gain trust and respect from his employees. Ignorance is rarely blissful in business; more often than not it will lead to ambivalence from the workers. By Alan J. Ryan; Ryan is a Computerworld senior writer. <<<>>> Title : MIS takes the field Author : CW Staff Source : CW Comm FileName: meadow2 Date : Jan 23, 1989 Text: In the mid-1970s, the outlandish idea of filling in a New Jersey salt marsh with sand dredged from New York Harbor and thereupon building a football stadium became reality. Giants Stadium at New Jersey's Meadowlands was a bold declaration of that state's independence from its larger neighbor, New York. Over the years, the complex grew by adding the Meadowlands race track and the indoor Brendan Byrne Arena. 
In contrast to the grand scale of the buildings, information management at the Meadowlands was at first an ad hoc affair; organization was sparse, and paper ruled. But the need to gain control over strategic information was irresistible. ``We did so many things manually, people realized there had to be a better way,'' says Gerry Connolly, MIS director at the New Jersey Sports and Exposition Authority, which operates the Meadowlands in East Rutherford, N.J., Monmouth Park Racetrack and an aquarium under construction in Camden, N.J. Two years ago, Connolly, a 16-year MIS veteran, came to the Meadowlands to bring order to its MIS shop. Just as the complex was a pioneering effort for New Jersey, so the 43-year-old Connolly had to carve out an efficient MIS operation. The tools he inherited were a recently purchased Unisys Corp. Mapper 10 system and a handful of personal computers. Calculating the costs His mandate is to improve the Meadowlands' profitability by making information on the costs of running events available to management. Connolly feeds data on receipts and expenses to his boss, Chief Financial Officer James Durkin. ``I want efficient daily and monthly information,'' says Durkin, who sees that the authority, as landlord of the complex, turns a profit. In 1987, the authority earned $46.5 million. The MIS shop's responsibility does not extend to tallying wagers at the racetrack, which is done by a service bureau, and the finances of the Meadowlands' sports teams _ the New York Giants, New York Jets, New Jersey Nets and New Jersey Devils. These tenants are responsible for their own computing. Carrying out his assignment, Connolly proceeded to build on what was available. He used Mapper, Unisys' fourth-generation language, to develop applications quickly and added an infusion of PCs to make office employees more productive. With an MIS staff of four, Connolly hired consultants to create approximately 10 programs in Mapper. 
The PC population grew from four to 40 in a year, and another 40 PCs are slated to be added in the next two years. Connolly's staff, meanwhile, doubled from four to eight. The MIS department at the Meadowlands is housed in the authority's corporate offices on the ground level of Giants Stadium. Connolly, who had grown accustomed to a buttoned-down atmosphere in his prior MIS stints at Pepsico, Inc. and McGraw-Hill, Inc., is fascinated by the running of the facility that surrounds him and enjoys taking strolls around the stadium during breaks. However, he must be on guard for football players who may suddenly appear out of nowhere as they jog in the bowels of the stadium. It's not the typical 9-to-5 environment. The Meadowlands' Unisys system was likewise new to the MIS director. Connolly, who began his career as a Cobol programmer, had always worked in IBM shops. But he credits the Mapper language with allowing him to bring up applications quickly and avoid a backlog. Brian Gorman, systems administrator and assistant to Connolly, has also become a Mapper fan. ``If you understand the application, you can program,'' he says. Gorman, who worked as a security guard at the complex as a student, was present for the opening of all three facilities. On the PC front, Connolly made the decision to standardize on Ashton-Tate Corp.'s Multimate for word processing and Lotus Development Corp.'s 1-2-3 for spreadsheet work running on Microsoft Corp.'s MS-DOS. A typical activity is to perform what-if scenarios, plugging figures into spreadsheets to get cost estimates for events. The shop has also standardized on IBM-compatible PCs, primarily machines using the Intel Corp. 80286 microprocessor. The shop buys a number of different clones, but mainly NEC Corp. and Compaq Computer Corp. processors. Some of the PCs equipped with terminal emulator boards are linked to the Mapper 10 host, from which they download corporate data for spreadsheet manipulation. 
Although balancing income against operating expenses can be relatively simple for some events such as the routine Giants sellouts, the process can be complicated for others. The annual Meadowlands Grand Prix Auto Race was once a record-keeping nightmare. The race course, which winds through the Meadowlands parking lot, must be laid out and lined with temporary barriers and grandstands. The cost of removing and replacing the barriers is significant. ``You couldn't tell when you spent money. Now, on the computer, you can look at the information in detail or globally,'' Connolly says. Another ongoing project is to use information systems to improve the racetrack's operations. A number of variables can affect racing attendance and betting take. For example, a snowstorm can cut attendance sharply. But if management knows how many hard-core racing enthusiasts are likely to brave foul weather, they can make sure the right amount of staff is on hand at the track. In addition to keeping up with current goings-on, Connolly must look ahead to prospective events. Next year, Giants Stadium will host the Army-Navy football game, and the MIS department is busy figuring out its costs and revenue. Further down the road, professional baseball could come to the Meadowlands, even though a recent proposal to build a baseball stadium was turned down by New Jersey voters. The idea is not expected to die, however, and Connolly, an avid baseball fan, looks forward to the day when New Jersey, which only a few years ago hosted no major league sports, adds a baseball team to its roster. By Stanley Gibson, CW staff <<<>>> Title : Adventures in tech implem Author : CW Staff Source : CW Comm FileName: getstart Date : Jan 23, 1989 Text: CAMBRIDGE, Mass. _ It all looks good on paper. 
But when the time comes to actually implement CASE tools, executive information systems and expert systems, the best advice those who have been through it can offer is to keep it simple, do not get in over your head and enlist the help of users. When Frank Morelli, associate director of expert systems at Colgate-Palmolive Co. in New York, was called upon to evaluate knowledge-based systems (KBS), he formed a group of users to discuss which areas of the company would be suited to such a system. Then he built his test system around an actual in-house problem using a personal computer-based expert system shell. At one time, the MIS group at Fidelity Investments in Boston was little more than a maintenance shop, said Claire McGhee, who was formerly the director of systems planning there. When the group decided to start developing computer-aided software engineering (CASE) applications, they found the greatest success when they placed a high value on people and set reasonable goals rather than lofty ones. And when a president at The New England, a Boston-based insurance and financial services firm, decided he needed help wading through the scads of reports that crossed his desk, he talked to the systems people. After a couple of failed mainframe-based program attempts, he finally got what he had asked for through a relatively simple PC executive information system (EIS) designed with the help of users, Vice-President Vince Ficcaglia said. That system has since made its way onto many other desks at The New England. People are solving real problems using technologies that many firms still shy away from. With the support of management and careful legwork, CASE, KBS and EIS can make important inroads, those exploring the technologies told attendees at last month's annual Conference on Strategic Issues in Managing Information Technology, which is sponsored by Decision Support Technology, Inc. 
Insider insight To have a successful development effort, a relatively stable staff is important, McGhee said: ``If you have high attrition, the training investment does not make much sense.'' A stable development environment also consists of two-way communications between management and staff, high quality and stable physical facilities, she said. Most developers are tempted to jump feet-first into a project once they have been given the go-ahead. Instead, it is important to carefully define what the end results should be. ``Rather than have technology drive the solution, it was just a part. We defined the need, and technology contributed to the solution,'' Ficcaglia said. Next, be certain that all of the necessary tools and training are in place before starting. Many companies make the mistake of taking the ``fire, ready, aim'' approach, trying to implement too many changes too fast and mistaking tools and technology for solutions, McGhee said. Others run into trouble when they implement systems that are out of synch with the company culture. Do your homework, she urged. Part of that homework should involve talking to users to see what they might expect from a new system and to inform them of a system's limitations, Colgate's Morelli said. The homework should also include a visit to a noncompetitor working on a similar project, the speakers said. And when you finally begin, start off small, if possible, and be sure to establish relevance; do not develop an expert system to solve a problem that could be effectively addressed with a spreadsheet. It is also important to have a core of users supporting the efforts under way. ``If you don't have a core of user champions, it is going to fail,'' Morelli said. The users' opinions are critical not only for creating a successful system but also for helping with the final product's acceptance. 
To keep this continued support, be willing to show results at various phases of the project and do everything you can to have the project completed on time, the speakers advised. A typical mistake that many companies make is postponing projects in the hope that the technology will be more fine-tuned and prices will drop soon, said David Ness, a professor at The Wharton School at the University of Pennsylvania and director of systems development at TV Guide. While that approach is prudent in some instances, it can become habitual, he said. And the company that follows this practice will be missing out on new developments that, while not perfect, can achieve favorable results. The speakers cautioned that projects such as EIS that are user-driven can cause friction with the traditional data processing group. ``Don't treat new technologies separate from traditional systems. They are just new tools in the information systems toolbox,'' Morelli advised. Another important lesson, Ficcaglia added, is to be realistic. No system is a long-term answer. ``Executive information systems have to be designed to expect and accept change,'' he said. By Alan J. Ryan, CW staff <<<>>> Title : Army's ISC may head East Author : CW Staff Source : CW Comm FileName: army Date : Jan 23, 1989 Text: The management of one of the world's largest MIS organizations will be shifting from Arizona to Massachusetts if the U.S. Congress approves a federal commission's proposals for the reorganization of U.S. military bases. The U.S. Army Information Systems Command (ISC), headed by Lt. Gen. Thurman D. Rodgers, is slated to shift from Fort Huachuca, Ariz., to Fort Devens, Mass. The overall plan is to reorganize or close 145 military facilities by 1995. The ISC oversees systems development and operations, communications, printing, television facilities and records management at Army sites throughout the world and employs 42,000 people. 
Under the plan, 1,156 military personnel and 2,784 civilian jobs will shift to Fort Devens. Personnel will come from Arizona and the consolidation of three ISC organizations now headquartered at the Information Software Center in Fort Belvoir, Va., the Information Systems Management Activity in Fort Monmouth, N.J., and the Software Development Center at Fort McPherson, Ga. The Commission on Base Realignment and Closures offered few reasons for its suggestions. The panel said Fort Devens is well suited for a national command, which led to speculation that the officials want the command sited closer to Washington, D.C., and to the pool of computer-oriented talent in Massachusetts. <<<>>> Title : Ups and downs of job hopp Author : CW Staff Source : CW Comm FileName: career16 Date : Jan 23, 1989 Text: To programmers, changing jobs every year or two has long been a simple solution to the question of how to raise both salary and skill level, at least during the first five years of a career. The chronic shortage of experienced technical personnel is part of the picture. Another one is that a manager hiring a new employee can usually offer a more attractive salary than can the employee's current manager, who likely is forced to remain within corporate raise guidelines. Such factors have helped ensure that the technical employee can usually boost his pay 10% to 20% with each job change. Furthermore, an MIS professional's long-term value to employers rises with his exposure to a variety of software and systems, and changing jobs often equips workers with the skills they need to continue pursuing profitable careers. The downside But while job hopping may benefit the professional and his new employer, it can also be an ongoing headache for MIS managers. Valued employees often leave for better prospects just when they have become most indispensable on their current projects. For this reason, managers sometimes look askance at job candidates with a record of frequent changes. 
Don Gloistein, a former department head at Alvin Community Hospital in Alvin, Texas, says a pattern of jobs lasting three to six months is probably the most disturbing, suggesting that the employee has failed to get through the probationary period. According to Mark Jacobs, a senior consultant at Data Pros, an East Hartford, Conn., recruiting firm, most employers prefer to see an employee who has put in three years or more at a company. But, Jacobs says, the major factor determining a manager's attitude toward frequent job changers is the career pattern prevalent among current employees. At a company at which most managers are long-term employees _ many coming to the firm straight out of college _ job hoppers have a much slimmer chance of being considered than they do at companies employing managers who built their careers by making well-timed changes. Edward Wisnewski, vice-president of staffing at Bank of America Systems Engineering in San Francisco, says that in his organization, he finds wide differences in how job hopping is defined, with the determining factor more often than not being the career pattern of the individual doing the hiring. Managers who have made their career with one or two companies tend to look less favorably at frequent job changers than do managers with more varied histories. But longevity in a previous job does not guarantee that an employee will be satisfactory. ``I got burned by solid resumes too,'' reports Gloistein, who says some hires had stayed on in previous positions only because they ``weren't bad enough to get fired.'' Jim Wetzel, a general supervisor in the information systems department of Baltimore Gas & Electric Co., claims that for him, the key to screening out candidates with this kind of ``stability'' is to make certain there has been a steady progression in responsibility at the applicant's previous job. Role playing Another factor that weighs heavily in hiring is the employee's role. 
Jacobs and Wisnewski agree that if a company is trying to find a technical specialist with experience using a hot new technology such as IBM's DB2 or automated teller machine networking, it may be more inclined to make allowances for a busy resume if the employee can demonstrate solid experience using the new technology. Another situation that can influence the decision is the hiring manager's familiarity with the applicant's previous employers. If the companies are known to have serious, ongoing turnover problems, poor pay or a reputation for promising would-be employees more than they can deliver, managers tend to look with less suspicion on employees who have left jobs after a short time. A manager's decision to hire an attractive, obviously capable prospect who has a history of frequent job changes also has a lot to do with the manager's feelings about how well his own company treats its employees. The manager who knows his company will not let him lose a good employee over a few dollars an hour is more likely to approve a hire in a borderline case. Hopping helps consultants Interestingly, the job hopping that works against the technical person applying for a regular job may work for him if he is being considered as a consultant. When evaluating consultants, who are usually only expected to work for a short time on a project, clients are interested almost exclusively in the applicant's skills _ something frequent job changes can expand. The clients do not have to worry about spending their training budget on the consultant, and if a consultant turns out not to have the skills he claims, the contract can be immediately terminated. Tom Rawson, an independent consultant and software developer at J. P. 
Software in Arlington, Mass., points out that for consultants, a history of short-term assignments can be a good record, because it shows that ``they did what they contracted with the client to do.'' Furthermore, once a programmer has done good work as a consultant and shown that he fits into the department, the client might hire him as an employee should the opportunity arise, even if his resume had ``too many'' jobs. By Janet Ruhl; Ruhl is a consultant programmer in Windsor, Conn. <<<>>> Title : Telex used values remain Author : CW Staff Source : CW Comm FileName: market16 Date : Jan 23, 1989 Text: A little over a year ago, Memorex International N.V. and Telex Corp. agreed to merge, forming the world's largest plug-compatible manufacturer. On June 3, the combined Memorex Telex N.V. announced the 1191, 1192, 1091, 1092 display station families. These terminals have been positioned against IBM's 3191 and 3192 terminal offerings, replacing older 3191- and 3192-compatible Memorex and Telex displays. The most recent Telex terminal to appear on the used marketplace is the Telex 191. This 12-in. entry-level monochrome display, which was first introduced by Telex in January 1987, has been withdrawn from the market and has since been replaced by the Memorex Telex 1191 family. The Telex terminal is currently trading for 62% ($900) retail. The Telex 078 Model 1, which was replaced by the Telex 191, is currently trading for 26% ($400) retail on the used market. The 078 Model 1 was first introduced in August 1984 and was positioned to compete with IBM's 3178 monochrome terminal family. End users have been migrating from this older terminal to the Telex 191 on the used market or to the newer 1191 from Memorex Telex. The 079 and 179 Telex color terminals remain in short supply on the used market. In June 1988, Memorex Telex replaced the Telex 079 and 179 with the 1092 C and 1192 C, respectively. 
End users have been moving gradually from the popular 079 and 179 to the new replacement boxes. Thus, used values have continued to hold for these color terminals. The 079 Model 1 is an entry-level 12-in. color terminal that is currently trading for 42% ($800) retail. The 179 Models 2, 3 and 4 are enhanced 14-in. displays, which were announced to compete against IBM's 3179 Model 1. The 179 Models 2, 3 and 4 are currently trading for 47% ($900) retail. In comparison with IBM, few used Telex terminals trade on the secondary market. It is for this reason that a stable end-user take-out market does not exist. Very few dealers are willing to take Telex terminals into their inventory, causing the end-user take-out market to be a ``whatever you can get for it'' situation. Dealers are painfully aware that they may have to sit on one of these terminals for a while before finding a buyer. Therefore, they are offering end users low take-out values as a hedge against future used market declines. For more information, contact IDC Financial Services Corp.'s Terri LeBlanc at 508-872-8200. By Lucinda Santisario, IDC Financial Services Corp. <<<>>> Title : Unraveling SQL for MIS pr Author : CW Staff Source : CW Comm FileName: train16 Date : Jan 23, 1989 Text: Structured Query Language _ known as SQL _ is a database access language that is emerging as the industry standard for relational database management systems. It presents a unique challenge to data processing trainers for several reasons. First, relational DBMSs are new to the production world in most organizations. Requests for training in IBM's DB2 and other systems are just arriving. Second, there is an implied contradiction in the marketing hype surrounding these products, such as the claim ``simple, yet powerful.'' These contentions raise questions such as, How simple? Simple for whom? Simple without training or because of it? Third, SQL is new. Its role in production applications is not well defined in many shops. 
Is SQL a programmers' language? An end users' language? If it is the latter, how far do we take end-user training? Finally, SQL exists in many dialects among different vendors' DBMSs. If you use Relational Technology, Inc.'s Ingres on personal computers, Oracle Corp.'s Oracle on Digital Equipment Corp.'s VAX machines and DB2 on the mainframe, which SQL do you teach? Fortunately, unless DP trainers are directly involved in SQL training, there is no need for them to become experts in the technology. They will, however, need to be conversant in the following four areas: Relational terms and concepts. The near and long-term future of DP belongs to relational and distributed relational systems and products. You must be able to identify and describe the basic components and critical issues of these new technologies. SQL. The common data access language for relational and distributed relational systems will be SQL. You should understand SQL at least to the extent that you currently know Cobol. You should participate in an SQL class or computer-based training course to gain a first-hand understanding of the technology's strengths, which include set-level processing, English-like syntax, a short initial learning period, data independence and simple data structures. Don't make any long-range SQL training decisions until you have met SQL in person. Relational database and application design. The system development life cycle is being changed in fundamental ways by extended relational analysis, prototyping tools and techniques and modern database and application design methodologies. These changes will alter your company's training needs dramatically. However, DP trainers, except for ones teaching database or application design, need understand only what these methodologies are (at a high level); how the organization is going to employ them; and who needs to master them, as well as the training alternatives, which are usually limited. 
Decision support products, computer-aided software engineering tools and fourth-generation language application generators. The technologies in this category, aimed at improving productivity, vary greatly. Again, trainers in general need only understand what they are, how their organization is going to employ them and what training alternatives are available. Everybody thinks so If all this learning seems like a lot of work, take heart _ your sentiments are echoed throughout the organization. Moving to new technologies is work for all affected areas of a company. The phrase ``gearing up'' for DB2, Oracle or Ingres means taking the time to get retrained in new procedures, concepts and approaches _ not fearing radical new means of going about your day-to-day business. DP trainers must be on the leading edge of implementing these new technologies. Without the support of trainers, many of the financial gains made possible by productivity-enhancing tools and services can be lost. Poor training, misdirected training or _ worst of all _ lack of training in these technologies will lead to ineffective use by applications personnel. The decision to go to a relational database does not simply involve purchasing software and a few more direct-access storage devices; it means a restructuring of job specifications as well as systems design and development methodologies, not to mention a hefty amount of training. These repercussions are the reason DP trainers must become conversant in the relevant areas. Without their advice and help, corporate efforts to implement new technologies such as SQL can end up costing more than necessary and perhaps even failing. Training is not a peripheral or secondary support activity; it is an intrinsic part of the successful deployment of SQL or any new strategic corporate technology. By Jonathan Sayles, Special to CW; Sayles is director of educational services at The Systems Group, Inc. in Glastonbury, Conn. 
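Sayles's point about set-level processing is easiest to see beside the record-at-a-time loop a Cobol programmer would write: one declarative statement operates on an entire table at once. The sketch below is an illustration only _ it uses Python's built-in sqlite3 module simply as a convenient SQL engine (an anachronism relative to the DB2, Oracle and Ingres products named above), and the parts table and its columns are hypothetical examples, not drawn from the article.

```python
# Minimal sketch of SQL's set-level processing. The sqlite3 module and
# the parts table here are illustrative stand-ins, not the 1989 products
# discussed in the article.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE parts (part_no TEXT, qty INTEGER, plant TEXT)")
cur.executemany("INSERT INTO parts VALUES (?, ?, ?)", [
    ("A100", 40, "Elkhart"),
    ("A200", 15, "Sarasota"),
    ("A300", 75, "Elkhart"),
])

# A record-at-a-time program would open the file, read each row and
# accumulate totals in working storage. The set-level equivalent is a
# single declarative statement over the whole table:
cur.execute("SELECT plant, SUM(qty) FROM parts GROUP BY plant ORDER BY plant")
print(cur.fetchall())  # [('Elkhart', 115), ('Sarasota', 15)]
```

Dialect differences aside, the same statement runs against DB2, Oracle or Ingres _ which is why the article treats SQL itself, rather than any one vendor's DBMS, as the skill worth training for.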
<<<>>> Title : Organizing to meet the sy Author : CW Staff Source : CW Comm FileName: stevens1 Date : Jan 23, 1989 Text: Factory automation can be a great thing, but when the organization is not there to support it, it doesn't do you much good. That is what Raychem, Inc.'s Metals Division, a maker of metal alloy couplings for the airline and marine industries in Menlo Park, Calif., discovered when its growth began to outstrip its manufacturing resource planning (MRP) system's ability to function. Raychem, which makes very high-precision products, has experienced a rapid increase in the number of its parts and processes in the last 10 years. To maintain control, it has developed multiple levels of computerized and manual tracking and statusing procedures. But while the division has been able to keep on top of the situation, cracks were showing in the foundation. Statusing was slow and sometimes inaccurate, job changes were difficult to implement and record, and a growing number of man-hours were being spent maintaining the tracking system. So last year, the company called in Andersen Consulting, a division of Arthur Andersen & Co., to help simplify the complex organizational infrastructure that was threatening to seriously hinder further growth of the division. ``An MRP system can't fix a problem in the factory; it can only make a good factory run better,'' said David Taft, Raychem's vice-president of manufacturing. ``When we became the sole supplier to a number of companies, we'd do anything to accommodate them. As a result, we found ourselves going from a development stage to a manufacturing stage without having done much process design.'' Parts and more parts The biggest problem for Raychem was the meteoric increase in parts numbers that resulted from the demand for custom parts _ each of which was assigned its own number _ and the practice of assigning a new part number to each subassembly. 
As a result, when the product line grew to 10,000, the division's database had to handle 40,000 parts numbers. According to Mike Rowan, the Andersen consultant who is working with Raychem, if the present system continued for five more years, the division would be bloated with more than 100,000 parts numbers. Rowan said Raychem's numbering system was effective in doing what it set out to do but added that the side effects were disastrous. ``They did indeed significantly enhance their ability to make the right product according to customer specification. But they did it at the cost of dedicating 40 to 50 people in a division of 400 to maintaining the system. The high labor cost was beginning to have an impact on their ability to compete on the basis of cost,'' he said. Raychem wanted to reduce the number of parts the division needed to track. The solution, which was developed jointly by Raychem engineers and Andersen consultants, was to change the plant design from a process to a product orientation. All machines that were needed to work on one part from start to finish were relocated in one area in a horseshoe configuration. The results were dramatic. Taft explained that in the old system, the six machines that were used to manufacture an average part were often located in three different buildings. Between each machine operation, there would be a quality control inspection, the part would be placed in inventory, a new work order would be issued and so on. Each of these steps required entering data into the computer and often meant generating a new parts number. Now that the six machines are in the same place, the part that comes out of one machine goes right to the next. This greatly reduces the need for large inventories, subassemblies or work-in-progress areas. This procedure whittled the number of steps needed to create an average part from 120 to 90 and reduced the distance the part had to travel from 2,500 to 50 feet. 
But most important for Taft, it meant that the only data collected for each part was a single work order that is tied to one parts number. A second simplification is the elimination of what Taft calls double-entry quality control. He explained that when there is a three-month wait between two operations on a part, the tendency is to inspect the part after the first operation and again just before the second one. With the new system, however, the time between operations is typically less than one minute, so one inspection per operation is enough. The new organizational structure at the division is also allowing it to make better use of an MRP system. ``When a factory is encumbered with so many parts numbers, the MRP system often becomes a tool for following orders through the factory when it's really supposed to be for strategic planning,'' Taft said. Taft contended that when bills of materials are six or seven levels deep, the MRP system becomes burdened beyond its capacity and is rendered incapable of doing any real planning. While Raychem has not yet been able to calculate dollar savings, there have been measurable results. Taft said that throughput time has been reduced by almost 99%, work-in-progress inventories have been reduced by 80% and setup time has been reduced by 70%. The division is also rapidly moving toward its goal of a fivefold decrease in the number of parts and a fourfold decrease in staff dedicated to data entry. By Larry Stevens, Special to CW; Stevens is a free-lance writer based in Springfield, Mass. <<<>>> Title : New products Author : CW Staff Source : CW Comm FileName: manufact Date : Jan 23, 1989 Text: Manufacturing Solutions and Systems has added full Help screens to the IBM System/36 version of its Manufacturing and Control System. The software covers manufacturing resource planning, full inventory control, product costing, shop floor control and purchasing applications. 
The system is expected to be available for the IBM Application System/400 in native mode in the second quarter. The Manufacturing Control System for the IBM System/36 costs $30,000 and includes education, phone support and maintenance. Manufacturing Solutions, Suite 156, Atrium Building, 120 Bishops Way, Brookfield, Wis. 53005. 414-786-1172. Scientific Computer Associates (SCA) has released the SCA Statistical System, which includes a Quality Improvement package developed specifically for manufacturing engineers. The program provides tools for the design and analysis of experiments used in statistical methods. Features include response surface analysis and analysis of multifactor and multilevel designs. The software is available in versions for Sun Microsystems, Inc., Hewlett-Packard Co., Digital Equipment Corp. and Apollo Computer, Inc. platforms. Commercial industries can license the SCA Statistical System at a cost of $30,000 for the initial year. Packages may be licensed separately or in combination. SCA, Suite 106, Lincoln Center, 4513 Lincoln Ave., Lisle, Ill. 60532. 312-960-1698. A numerical control file graphic and text editor is now available from Consultek Software Systems, Inc. Targeted at small to medium-size personal computer board fabricators, the SU-08 permits the reading of most numerical control files and then displays the files in text or graphics format for editing, the vendor said. The SU-08 costs $899. Consultek, Suite C-25/26, 1400 Coleman Ave., Santa Clara, Calif. 95050. 408-988-8091. Ithaca Software, Inc. has announced that its three-dimensional graphics tool system Hoops now supports Apollo Computer, Inc., Hewlett-Packard Co. and Silicon Graphics, Inc. workstations. According to the company, the system was designed to simplify the development of 2-D and 3-D technical applications and is based on a hierarchical object-oriented graphics database. The software can also be ported to Digital Equipment Corp., Apple Computer, Inc. and Intel Corp. 
80286- and 80386-based environments. The latest release is scheduled for delivery in March and is priced at $3,450. Ithaca Software, 902 W. Seneca St., Ithaca, N.Y. 14850. 607-273-3690. International Bar Code Systems, Inc. (IBC) has introduced tool crib management software developed for Digital Equipment Corp. VAX/VMS systems. BCScrib maintains a record of the issue and return of every tool and tracks both durables and expendables. BCScrib is priced from $8,000 to $40,000, depending on system configuration. A version is also available for the IBM Personal Computer XT or AT. IBC, Suite 121, 265 W. High St., E. Hampton, Conn. 06424. 203-267-6651. Intecolor has introduced a personal computer workstation optimized for industrial control environments. The IPT 2000 is an IBM Personal Computer AT-compatible product that reportedly features a 20-in. IBM Video Graphics Array display, an Intel Corp. 80286 processor and a 20M-byte hard disk. The IPT 2000 costs $7,995 in single quantities, with discounts available for volume purchases. Intecolor, 2150 Boggs Road, Duluth, Ga. 30136. 404-623-9145. A series of personal computer-based data acquisition and control products is now available from Heath/Zenith Computer Based Instruments and Burr-Brown Corp. The modular board, panel and software components were designed to increase data analysis capabilities in most manufacturing and processing businesses. The Heath/Zenith Data Acquisition and Control System is said to be especially suited for use in industries that require the control and monitoring of sound, temperature, strain, velocity and displacement. Pricing is dependent on configuration. Heath/Zenith, Hilltop Road, St. Joseph, Mich. 49085. 800-331-0277. <<<>>> Title : Not so fast... Author : Larry Buerk Source : CW Comm FileName: buerklet Date : Jan 23, 1989 Text: Regarding your article saying that image transmission times have been reduced from 10 seconds to two [CW, Nov. 
14]: It takes the improved Photophone two seconds to retrieve an image from a diskette and display it on the screen, not to transmit the image. Transmission time ranges from about 12 to 25 seconds, depending on the resolution mode selected and the content of the image itself. Larry Buerk Director, Corporate Communications Image Data Corp. San Antonio <<<>>> Title : Both sides now Author : Charley B. Cross Source : CW Comm FileName: crosslet Date : Jan 23, 1989 Text: Janet Ruhl's article ``Now you see it . . .'' [CW, Nov. 21] struck a chord I feel quite strongly about: honesty in hiring. It is worth mentioning that there are those of us on the hiring side who believe that full and fair discussion of both the job and the applicant's qualifications is right, both ethically and in terms of the long-term benefits derived by the company and the employee. In my department at Wang, managers and peers interview candidates. This gives us a broader view of the candidate, allowing us to make a more considered decision. Just as important, it gives the candidate the opportunity to really see the environment in which he or she may soon be working. In an interview, I always turn it around and invite the candidate to interview me about the job, the group or Wang in general, and I answer as accurately as I can. I believe that honesty on our part encourages honesty in our candidates. Then if both sides decide to go ahead, we get an employee who is more likely to enjoy the job, be productive and stay with us. And the employee is more likely to land a job that solidly meets his or her career objectives. Charley B. Cross Manager, Investment Systems Development Wang Laboratories, Inc. Lowell, Mass. <<<>>> Title : AS/400 serves minibar mak Author : CW Staff Source : CW Comm FileName: dometic6 Date : Jan 23, 1989 Text: LAGRANGE, Ind. _ A month-old IBM Application System/400 Model 60 has put MIS in the catbird seat at Dometic Corp., a manufacturer of appliances for recreational vehicles. 
The 48M-byte machine allows one MIS group to run factories in three states and supports 150 users nationwide. The data center of the $225 million Dometic, based in Elkhart, Ind., is located here. From this point, 14.4K bit/sec. long-distance telecommunications lines reach out to factories in Elkhart and Evansville, Ind., to Sarasota, Fla., and Santa Ana, Calif. Long-line costs range up to $3,000 a month, but they are less than the potential costs of having remote processors at each site, said Ken Buckler, Dometic's controller, who manages the information systems department. Dometic's star configuration suits the needs of its rapidly growing business, which tripled its revenue in the last three years by selling such recreational-vehicle amenities as minibars. Way to go ``We think a centralized system is the way to go,'' Buckler explained. ``It makes maintenance a lot easier, and it makes upgrades easier, so everyone's taking advantage of the latest programs.'' It also allows a small data processing staff of four to support the 1,000-employee company, which is a subsidiary of the Swedish AB Electrolux appliance conglomerate. Buckler said that his small staff also gains great leverage by sticking with standard IBM products _ and with third-party software. ``Most companies are trying to do their own thing,'' Buckler said. ``They write their own programs, and by the time they're through, the programs are technically obsolete. We spend far less than other companies do, but we're getting better results by letting IBM and application software provider System Software Associates, Inc. [SSA] in Chicago cope with the changing technology.'' All the application software was written by SSA, whose Business Planning and Control System (BPCS) contains 25 modules that handle everything from accounts payable and receivable to scheduling, shop-floor control and manufacturing resource planning. Systems analyst Sandy Smith said Dometic has implemented 15 of the BPCS modules. 
BPCS, which replaced a manufacturing accounting and production information control system on an IBM System/38, is a real-time system that updates all company activities on a day-to-day basis. Dometic's end users are encouraged to use the BPCS to check sales and inventory at will. They are also reminded to print out their own reports rather than ask the DP staff to do so. ``This machine has placed the system in the hands of the end users,'' Buckler said. Dometic's AS/400 Model 60 was installed in early December, replacing an IBM System/38 Model 600. The system was acquired under IBM Credit Corp.'s new AS/400 Total System Lease program and was installed by a single IBM field engineer. The conversion to the AS/400 was made during a weekend; Dometic never even considered running the two production systems in parallel. When one major problem cropped up, Dometic used the AS/400 remote maintenance feature to diagnose an overheated disk-drive component. IBM installed a replacement part the following day. Preparation for the changeover from the System/38 was minimal. ``We assigned two programmers to the night shift for a while, so they could work on a dedicated machine,'' Buckler said. ``Beginning on a Friday night, we started dragging data files across from the System/38 to the AS/400. By Monday morning, we were up and running.'' By Jean S. Bozman, CW staff <<<>>> Title : Kodak flashes image senso Author : CW Staff Source : CW Comm FileName: hardbit Date : Jan 23, 1989 Text: Eastman Kodak Co. said its research scientists recently fabricated an ultra-high-resolution image sensor with four million pixel elements, which it said is more than double the pixel elements of current sensors. High-resolution image sensors are used in high-speed video motion analysis, vision robotics and image processing applications. Kodak introduced a one-million pixel image sensor in 1986. Silicon Graphics, Inc. said it recently shipped the 1,000th Personal Iris. 
Out since October, the machine is the firm's low-end three-dimensional graphics workstation. McDonnell Douglas Corp. received the 1,000th system, which it will use in commercial jet design operations. National Customer Engineering, Inc. recently announced maintenance support for Sun Microsystems, Inc. workstations and peripherals. The company, headquartered in San Diego, has a total of 33 regional service centers. Briggs & Stratton Corp., a manufacturer of small gasoline engines, recently bought a Control Data Corp. Cyber 960 mainframe and 41 Cyber 910-400 graphics workstations. The CDC system will be used for design and engineering applications. Unisys Corp. won an $8 million contract from United Airlines for a 2200/600 series and a 1100/92 series mainframe computer. The systems will be used as part of the airline's effort to consolidate its San Francisco and Chicago data centers into one center in Chicago. The 2200/600 is intended to be used as part of United's Unimatic flight operations application, which is currently running on a Unisys 1100/93 mainframe. <<<>>> Title : Source of power Author : Alison Harris Source : CW Comm FileName: harrisle Date : Jan 23, 1989 Text: Thank you for publishing David Gabel's piece ``Protecting computer power'' [CW, Nov. 28]. However, I think it is important to bring up one point that the article missed. Not all power-related problems will be obvious. Although blackouts and brownouts are sure signals that something is wrong with your power source _ whether internal or because of power company supply _ there are many other not-so-noticeable power-related problems. Random missing data, periodic unnoticeable crashes and other arbitrary glitches often are caused by faulty power. Readers should monitor their power source as a matter of course rather than waiting to pinpoint that it is indeed the cause of their problems. 
Alison Harris Managing Editor Service News Yarmouth, Maine <<<>>> Title : A real country Author : Gregory Hawrysch Source : CW Comm FileName: hawlet Date : Jan 23, 1989 Text: I thoroughly enjoyed your editorial ``Glass not'' [CW, Nov. 28]. Since I am of Ukrainian descent, however, I feel it is important for your editorial staff to recognize that the use of the word ``the'' before Ukraine is inappropriate. The use of this form suggests that Ukraine is a region when it is, in fact, a country of 60 million people. Respectfully, I, along with millions of Ukrainians in the free world and those under Russian oppression, lose a micro grain of precious dignity left to us as a people each time our country is referred to as a farm plot in the Soviet Union. Gregory Hawryschuk Rochester, N. Y. <<<>>> Title : Intergraph airs another R Author : CW Staff Source : CW Comm FileName: inter1 Date : Jan 23, 1989 Text: HUNTSVILLE, Ala. _ Intergraph Corp. joined the winter harvest of firms offering products for the desk top recently, when it unveiled a new line of reduced instruction set computing workstations and servers. The five-member 3000 series will be based around the firm's C300 Clipper microprocessor, which is capable of processing 10 million instructions per second (MIPS). The firm said the products will offer full binary compatibility, ensuring that software created on earlier generations of Intergraph workstations will run without modification. All of the 3000 series Interpro workstations also include an Intel Corp. 80386 I/O processor and custom graphics processors. Memory options for the series range from 16M to 112M bytes of main memory. Disk options can extend the Interpro's disk capacity from a standard 355M- or 670M-byte capacity to up to 7G bytes. The Interpro 3050 and 3060 workstations will each sport a 19-in. 1M-pixel display screen and sell for $45,000 and $55,000, respectively. The 3070 offers a 27-in. 2M-pixel display screen and is priced at $56,000. 
All are available immediately. Intergraph officials also confirmed that the company's 300 series workstations can be field-upgraded to 3000 series specifications by replacing the older processor with the 10-MIPS C300 Clipper processor. They added that pricing for the new chip has not yet been established. The pair of servers _ the Interserve 3005 and Interserve 4000 _ were designed for shared-compute and file-server applications. The Interserve 3005 is a single-bay server configured with 670M to 6G bytes of disk capacity; it will sell for $44,000. The two-bay Interserve 4000 was designed for larger jobs and provides 584M to 23G bytes of disk memory; it will be priced at $98,000. The firm also announced price reductions of up to 37% on the Clipper C100 microprocessor, the C300's predecessor. Intergraph spokesmen said the announcement was moved up a few days so it would not occur the same week Digital Equipment Corp. unleashed a series of machines designed for desktop users [CW, Jan. 9]. ``In the PR wars of the world, it's very hard to go up against DEC and all the blabbing that's going to go on with that announcement,'' spokesman George Ralls said. ``This way, we figured we can sit down and talk with our customers and avoid all the hoopla.'' By James Daly, CW staff <<<>>> Title : Finding a balance Author : Dan Kamoji Source : CW Comm FileName: kamojile Date : Jan 23, 1989 Text: I agree with Michael Alexander's article ``Downsizing threatens MIS influence'' [CW, Nov. 28], in which he states that the issue may lie in finding a balance between distributing technology throughout the organization on the one hand and keeping the data center on the other. A changing role of MIS is to build the architecture through which the dispersed systems will function. MIS will also be responsible for managing the connectivity of the organization's networks along with setting guidelines and creating a methodology for using and controlling the network. 
While connectivity is important, connectivity alone will not do. At early stages of the organization's networks, it is critical to develop and document accepted guidelines for using the network. In such guidelines, it is imperative to document who performs what functions and when. Security issues should also be considered before security violations occur. It should be clearly stated which department is responsible for developing which part of the methodology for using the network. Maintenance issues should be resolved. The second important step is to enforce the methodology throughout the organization. Dan Kamoji Standards Analyst Merchants Service Corporation Indianapolis <<<>>> Title : Honeywell Bull farms out Author : CW Staff Source : CW Comm FileName: multics Date : Jan 23, 1989 Text: Three years after it gave Multics users the grim news that the Multics operating system and hardware would not be enhanced, Honeywell Bull, Inc. said recently that it is turning over service and support of its Multics operating system to ACTC Technologies, Inc. in Calgary, Alberta, Canada. ACTC will support Multics worldwide and assist Multics users in migrating to Honeywell Bull's XPS 100 version of AT&T's Unix, the company said. Multics is a large-scale operating system that once established a small but devoted following of Honeywell users. A number of Multics users were outraged almost three years ago when Honeywell announced at a users' meeting that Multics would not be enhanced further. At a later meeting, some users wore black armbands as an expression of protest. Developed 19 years ago, Multics had advanced features such as a built-in relational database. However, it was designed for 36-bit hardware, which Honeywell dropped in favor of 32-bit systems. Honeywell Bull told users it would offer a Multics-like follow-on operating system by 1988. 
That project, however, never came to fruition, and it was announced last spring that Multics users would be urged to migrate to XPS 100. ``At the time we made the announcement, we were going to make a revolutionary change to GCOS 6 for our DPS 6 user base,'' said Garry Kaiser, Multics marketing manager for Honeywell Bull. ``Those changes were going to be very Multics-like.'' It was later decided that only incremental changes would be made to GCOS 6. ACTC, which has already been handling some Multics support, will develop software and tools to help users migrate to XPS 100. Support for migration to the Open Software Foundation's OSF1 version of Unix will be offered later, Kaiser said. Honeywell Bull is an OSF founder. By Stanley Gibson, CW staff <<<>>> Title : All this and more Author : Neil W. Plouff Source : CW Comm FileName: plouffle Date : Jan 23, 1989 Text: Regarding Bill Gates' statement on OS/2 memory requirements [CW, Nov. 21] that ``true multitasking won't work in a 1M-byte system,'' perhaps Mr. Gates has forgotten Microsoft Corp.'s product, Amiga Basic, which ships with every Commodore Amiga. The base Amiga configuration includes 512K bytes of random-access memory and 256K bytes of read-only memory. Amiga Basic multitasks quite nicely in this amount of memory. In fact, running two or three moderate-size programs plus a spare command line window is possible on a 512K-byte machine. Neil W. Plouff Bolton, Mass. <<<>>> Title : Another brick Author : David Spencer Source : CW Comm FileName: spencele Date : Jan 23, 1989 Text: I enjoyed reading your recent Viewpoint article by Efrem Mallach about reduced instruction set computing (RISC) technology [CW, Nov. 28]. I found him to be right on the mark in regard to the ``RISC marketplace'' being a fictional entity created by marketing enthusiasts. However, Mr. Mallach unfortunately continued to propagate a basic misunderstanding about the actual nature of RISC design. 
RISC is actually just another step in the evolution of chip design, once again made possible by continuing advances in miniaturization. Chip miniaturization has now reached the point where there is enough room on a single chip to once again build computers entirely from discrete logic. These computers are as full-featured as their microcoded ancestors. But they have a strong advantage because they run the CPU instructions directly, giving them an order-of-magnitude increase in speed over microcoded machines that must first interpret their instructions. Eventually, the day will come when there will be so much space available on chips that we may see operating systems in microcode, on-chip main memory or perhaps even entire systems that include device controllers on the chip. David Spencer Sacramento, Calif. <<<>>> Title : Let's have consistency Author : Stanley Gibson Source : CW Comm FileName: stancol Date : Jan 23, 1989 Text: And lo, there was upon the Earth a great noise. And the noise was good. Well, almost. The noise came from DEC in its desktop announcement last week _ one of its largest ever in terms of people present and number of products introduced. The announcement is critical for DEC, which needs a big bang to succeed on the desk top. The company hopes to strike a lightning blow against archrivals Sun Microsystems and IBM and to assert its equality, if not outright dominance. Its strategy of connecting multiple operating systems and desktop devices, while not elegant, may substantially succeed. The linchpin in the connectivity scheme is Decwindows, DEC's X Window System-based user interface and interoperability scheme. But there were two aspects of the announcement that raised questions about the omnipotence of Decwindows and DEC's skill in getting across a consistent message. First, where was Apple? 
One year ago, Ken Olsen was praising John Sculley and Apple as kindred spirits in computing as if to say, ``If we made a personal computer, it would be the Macintosh.'' One year later, at what DEC called its most important desktop unveiling and most comprehensive announcement ever, Sculley and Apple were mighty scarce. Macintoshes were mentioned in passing as simply one of the myriad desktop devices to which DEC connects. The absence of Decwindows on the Mac is the reason Apple was not in the spotlight this time. Although the two firms are supposedly still busily at work in joint development, it is unclear what they are working on. The other deficient aspect of the announcement was the absence of Decwindows Display Facility software, which is said to enable PC users to access Decwindows applications residing and executing on other systems. DEC had sent out the message that it was bringing Decwindows to Microsoft's MS-DOS. But DEC pulled its punch, saying that the facility ``will be part of a future release of Digital's PC integration software available later this year.'' There was no price. There was no availability date. And this from a company that claims to ``have it now.'' Indeed, Decwindows will never run on an MS-DOS-based PC; it will run on a server to which the PC is connected. Although Decwindows uses up lots of memory and is too large to be contained in MS-DOS, it could be accessed in expanded memory as virtual disk. A final thought. The Decstation 3100 is based on the Mips Computer Systems reduced instruction set computing microprocessor and runs Ultrix, DEC's version of AT&T's Unix. Ardent Computer makes a ``personal supercomputer'' that uses multiple Mips microprocessors and runs a variant of Unix. Why doesn't DEC just acquire Ardent instead of developing its own multiprocessor version of its Decstation 3100? 
That way, it could quickly expand its product line and at the same time bring Gordon Bell, the creator of DEC's VAX line who is now working for Ardent, back into the fold. By Stanley Gibson; Gibson is Computerworld's senior editor, software. <<<>>> Title : A true linkage? Author : Ron Wolf Source : CW Comm FileName: wolflet Date : Jan 23, 1989 Text: Regarding ``IBM pushes SAA-Unix links'' [CW, Nov. 28], the wide range of overlapping options makes the choice of application and communication architectures all the more difficult. ``Traditional'' Unix components such as Network File System and X Window System have been announced for MVS, VM and CICS. Also, LU6.2 and other Systems Application Architecture (SAA) components have been announced for IBM's AIX. Before deciding on an intersystem connectivity architecture, the following pair of questions needs to be answered: Which system is best for each application component? Which system will be best in the future for each application component? In the ``AIX Family Definition'' announcement, the following is found in the summary: ``The AIX Family Definition is IBM's long-term strategy for providing, across multiple IBM environments, a consistent set of interfaces, conventions and protocols.'' Following up on this strong statement in the ``IBM AIX/370'' marketing brochure is a section entitled ``Systems Application Architecture Relationship.'' This reads: ``Where conflict exists between AIX and SAA, IBM will give priority to maintaining consistency in the Unix environment.'' Perhaps the ultimate role of SAA-Unix connectivity will be to foster migration of applications from SAA to Unix as even IBM enters the age of the open system and leaves the closed system behind. Ron Wolf Foothill Research, Inc. Belmont, Calif. <<<>>> Title : Leading the pack Author : Dennis C. Wright Source : CW Comm FileName: wrightle Date : Jan 23, 1989 Text: I read the article titled ``Cray lays out product plans for next decade'' [CW, Nov. 
21] with interest. Cray Research is again on the forefront of technology. The article indicates that it is even redefining the classical measurement of memory capacity; memory will now be sized in millions of floating-point operations per second rather than in bytes or words. Dennis C. Wright Houston <<<>>> Title : Users say yes to IBM 3990 Author : CW Staff Source : CW Comm FileName: 3990u Date : Jan 23, 1989 Text: Users of the IBM high-end storage controller that shipped at the end of 1988 are already reporting big performance boosts for their mainframe operations. As a result, they say they do not care that IBM failed to meet its original delivery schedule and delayed the more advanced features for the product until sometime later this year. ``Obviously, when the additional features come out, we'll pick up more performance,'' said George Kulman, first vice-president of MIS at Integrated Resources, Inc. in Elmwood Park, N.J. ``Actually, it ended up being an easy phase-in of the system.'' The controller, the IBM 3990 Model 3 with cache capability, was originally scheduled for delivery in the third quarter of 1988. It was to be a major leap in technology from both IBM's earlier 3880 controller and the initial models in the 3990 line. The newest 3990 promised not only to boost basic controller features such as channel speeds but also to add new features such as DASD Fast Write and Dual Copy. In mid-1988, IBM announced that it would not in fact ship in the third quarter and targeted first-quarter 1989 as a new shipment date. In late September, IBM again changed the schedule and said it would deliver the Model 3 with cache capability by year's end. But the DASD Fast Write and Dual Copy, also known as the Extended Features, would still be delayed until sometime in 1989. Despite these delays, users contacted by Computerworld last week said they are impressed with the product because even the base Model 3 without the cache boosted their performance. 
The cache capability, delivered in December, and the Extended Features, expected this year, are viewed as bonuses, users added. ``First and foremost was the four-way pathing, but we also went to a 4.5M bit/sec. channel speed,'' said Keith Volkmar, director of operations at Blue Cross/Blue Shield of Rochester in New York. Volkmar's previous controller, a 3880, had a 3M bit/sec. channel speed. Many users opted to phase in the Model 3 in three steps, a plan for which IBM offered a small discount of 2% or 3% off the total controller price. Early in fall 1988, a base Model 3, which is actually a 3990 Model 2, was offered. One user jokingly referred to it as the ``Model 3 Minus.'' In December, the microcode necessary to use the cache was delivered. An early support program for the Extended Features is scheduled for the first half of this year. ``We were concerned when they first said it would be delayed to the first quarter of 1989, but then they offered us the discount,'' Volkmar said. ``Since we were coming from a noncache environment, it worked out OK. We didn't need the boost from the cache right off the bat.'' As far as the Extended Features go, Volkmar said, ``I'm looking at it as an extra. When they come, they come.'' Other users are as accepting of the shipment schedule as Volkmar, because each phase brings performance boosts. ``Even running it as a Model 2 had advantages,'' said Mike Dille, managing director of data processing at Fingerhut Corp. in Minneapolis. ``It doubled the number of paths, which itself gave us performance gains,'' he said of the four-way pathing featured in the base Model 3. Fingerhut received the cache capability microcode in December and gained performance boosts ranging from 12% to 20%, Dille said. For example, Dille's staff compiled data that compared I/O operations for an IBM IMS application using a 3880 controller with cache and a 3990 controller with cache. The staff did not change disk drives, which are 3380 Model Ks. 
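Gains in the band Dille cites are easy to sanity-check with simple percentage arithmetic. The sketch below is purely illustrative _ the function name and values are ours, not Fingerhut's _ and computes the improvement for a rise from 12 to 14 I/Os per second, the kind of before-and-after figures the staff's comparison produced:

```python
# Illustrative sketch only: sanity-checking the kind of throughput gains
# cited in the 3990 story. The 12 -> 14 I/Os-per-second pair mirrors the
# 3880-versus-3990 comparison run on the same 3380 Model K drives.

def pct_gain(old_rate, new_rate):
    """Percentage improvement in throughput from old_rate to new_rate."""
    return (new_rate - old_rate) / old_rate * 100.0

gain = pct_gain(12, 14)   # I/Os per second, before and after
print(f"{gain:.1f}%")     # prints "16.7%"
```

Any before-and-after rate pair can be plugged in the same way; a rise from 12 to 14 I/Os per second works out to roughly 16.7%, squarely inside the 12% to 20% range Dille reported.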
With a disk volume of nearly 1.9G bytes, the system performed 12 I/Os per second using a 3880. With the same disk volume and the 3990, 14 I/Os per second were performed. By Rosemary Hamilton, CW staff <<<>>> Title : DEC bundles nomenclature Author : Glenn Rifkin Source : CW Comm FileName: pcdec Date : Jan 23, 1989 Text: If Yogi Berra was out in Littleton, Mass., last Tuesday, he might have muttered, ``It's deja vu all over again.'' For those of us who've been in the business long enough, DEC's glitzy personal computer and workstation announcement last week seemed eerily like a similar big day DEC put on in May 1982. On that day, a beaming Ken Olsen stood on a brightly lit stage in downtown Boston and introduced DEC's contribution to the PC wars _ the Pro, the Rainbow and the Decmate II. He showed off the machines himself, holding up one of the monitors in his huge hand and declaring, ``This set of products has created more enthusiasm, more excitement in the company than I've ever seen before.'' Olsen boasted that the machines had ``an architecture that should last and propagate forever'' and completed his presentation by declaring, ``My reaction is only one: I'd hate to compete with these machines.'' Enough said. It's kind of cruel to bring up this failure as DEC basks in the positive response to its latest desktop announcements. The fiasco that was DEC's PC strategy for the past six years has been well documented. But one couldn't help but notice the parallels on Tuesday. In 1982, DEC unveiled three machines, all running different operating systems and all purporting to be targeted to different audiences. Olsen decided against choosing one PC to represent DEC, claiming that ``the market will decide.'' And as one of his product managers pointed out, ``The market decided, and it chose IBM.'' This time around, DEC introduced seven new machines ranging from a base model personal computer to several high-powered workstations. 
The machines themselves are impressive, as were the ones in 1982. But once again, marketing questions have leaped up and are begging for answers. Such as, who thought of the names of these varied machines? The PCs are called Decstations and the workstations are called Vaxstations _ except for the main workstation, which is called Decstation. And the machines were given numbers such as 210 and 3520. There were lots of numbers in fact, and not one made any sense. Why, for instance, is there a Decstation 3100 and a Vaxstation 3100? Are they related in any way? Is the customer supposed to be able to differentiate between them? If, by accident, customers order a Decstation 3100 instead of a Vaxstation 3100, they'll end up with an $11,900 reduced instruction set computing-based Unix machine instead of a $7,950 desktop VMS-based unit. I asked Jack Smith, DEC's senior vice-president of engineering and manufacturing, about the names, and he gave me a wry smile and said, ``We're still thinking about the names.'' He admitted that there was confusion and said he hoped that people would keep referring to the products by their well-publicized code names, PMAX and PVAX. I asked Olsen about the names. ``They're as confusing to me as they are to you,'' he replied. Does this kind of planning represent a marketing strategy? If DEC is starting to get marketing religion, why didn't anybody think of better names for these products? Certainly the names won't make or break them, but confusion about the products helped kill the 1982 PCs. Confusion about product identity has never been a characteristic of success. These announcements are crucial for DEC, a signal that the company understands which way the trade winds are blowing. It cannot afford to lose the workstation market the way it did the PC market, and these products make it clear that DEC will be a prominent, if not dominant, player. 
But the new products stand a far better chance of success if DEC learns to play the name and marketing game. By Glenn Rifkin; Rifkin is a Computerworld senior editor and co-author with George Harrar of The Ultimate Entrepreneur: The Story of Ken Olsen and Digital Equipment Corporation. <<<>>> Title : Battle on field in fourth Author : CW Staff Source : CW Comm FileName: prevue Date : Jan 23, 1989 Text: Analysts last week predicted a computer industry fourth quarter marked by continued erosion of the traditional minicomputer market, heightened user expectations and an impending price war in the fast-track microcomputer sector and triumph for IBM. The marketplace's move toward the desk top continues, noted Timothy McCollum, an analyst at Dean Witter Reynolds, Inc. McCollum predicted that the numbers coming from major personal computer and workstation vendors _ Apple Computer, Inc., Compaq Computer Corp. and Sun Microsystems, Inc., for example _ will be largely favorable. Even in the relatively vital microcomputer sector, analysts expressed more conservative optimism than outright enthusiasm. ``Tandy Corp. didn't have a bad December quarter _ not a great one, but not a bad one,'' said Richard Shaffer, president of Technologic Partners in New York. ``It looks like a good quarter for Compaq, Apple, IBM; nothing should change that.'' However, Shaffer cautioned, a microcomputer user community that is increasingly ``more sophisticated, more demanding and more able to do something about it'' is quickly creating a market that will not tolerate poor quality, poor service or poor price/performance. Breather from buying In addition, according to Redding, Conn.-based computer industry analyst Dale Kutnick, the microcomputer sector is undergoing egregious overcapacity at a time when large corporate customers are taking a breather from buying until prices come down even lower than they already are. 
Variations on one or more such themes, Kutnick said, already underlie warnings of fourth-quarter losses by Wyse Technology and AST Research, Inc. and presage a pricing war that could leave widespread damage. Rumors that the mainframe market is played out have not reached Amdahl Corp. Well-positioned in the midst of a product rollout, the company ``just completed its best year yet, and there's no reason to believe that the trend won't continue,'' said Jeffry Canin, an analyst at Hambrecht & Quist, Inc. Also standing tall among the giants is IBM, which could achieve double-digit fourth-quarter revenue growth partly fueled by ``pretty spectacular'' shipments of its Application System/400 mid-range entry, Kutnick said. In the mid-range market, ``the big theme is the bloody, competitive battle for the desk top,'' Canin said. Although Digital Equipment Corp.'s fourth-quarter results are expected to be less than spectacular, its workstation announcements last week and the imminent enhancement of the already strong VAX 6200 family clearly position the company among the triumphant, Kutnick noted. The promise of a great 1989 should temper any negative reactions to DEC's December quarter, he added. What can we expect in the workstation niche? ``World War III in 1989,'' according to S. G. Warburg & Co. analyst David Wu. Analysts are looking to billion-dollar market leader Sun for a continuing ascent. Apollo Computer, Inc. should be profitable but not as strong as analysts would like to see it, Dean Witter's McCollum said. Software and services companies overall enjoyed a good, solid quarter, analysts said. Probable high rollers include Computer Associates International, Inc. and Microsoft Corp., which clearly has the strongest momentum in the microcomputer software sector, according to Rick Sherlund, an analyst at Goldman Sachs & Co. 
By Nell Margolis, CW staff <<<>>> Title : SIA probes health hazards Author : CW Staff Source : CW Comm FileName: sia1 Date : Jan 23, 1989 Text: CUPERTINO, Calif. _ After preliminary results from a University of Massachusetts health study of wafer fabrication employees revealed a higher rate of miscarriages than for a similar group of nonwafer production employees, the Semiconductor Industry Association (SIA) recently pledged $3.5 million for its own research. With participation from 18 chip producers and systems makers with captive semiconductor operations, the study will be conducted on an unspecified number of volunteers from plants nationwide. It will attempt to identify any problems so that companies can rectify them, an SIA spokeswoman said. The Massachusetts study that triggered SIA interest was sponsored by Digital Equipment Corp. and examined 750 female semiconductor operations employees in 1986. Chip production involves a variety of hazardous materials including solvents, arsine gas (arsenic) and heavy metals. Among systems and communications equipment makers participating are AT&T, Northern Telecom, Inc., Hewlett-Packard Co. and DEC. <<<>>> Title : Techie preacher's pulling Author : CW Staff Source : CW Comm FileName: church1 Date : Jan 23, 1989 Text: St. $ilicon, a.k.a. Jeffrey Armstrong, is not seeking a flock of flakes. His Church of Heuristic Information Processing (CHIP) in Santa Cruz, Calif., is for the computer-literate only. The required audience response to his preaching is not ``Amen'' but ``Enter.'' ``How many of you have ever lost data?'' this curious high priest of high-tech asks his audience on the main street of Santa Cruz. ``The church has something to sever your connection with it so the data can go on. It's Purge-atory. The gate to Purge-atory says `Abort? Retry? Ignore?' '' St. $ilicon has been preaching to the technologically converted for nearly three years at users meetings, in board rooms and occasionally from the streets. 
Formerly Apple Computer, Inc.'s representative to the Middle East, Armstrong said he came to the strange occupation of computer humorist to help relate individuals immersed in computers to the historical and sociological meaning of our technological society. ``No matter what you do, the personal element is at peril in a rule-based system,'' he said. St. $ilicon administers his inspiration from the Giver of Data, or god, to two branches of the church _ Cathod-lics and Geek Orthodox. At the end of their religious teaching, his followers reach Nerd-vana. ``Being St. $ilicon is my way of atoning,'' $ilicon said. ``It's not just a question of comedy. Humor is ultimately philosophical and challenging. Reality is funny.'' Admittedly, some of $ilicon's humor has nothing to do with the church: He's worried about PC-ness envy, a condition in which a user is overly concerned that someone else's computer has more random-access memory than his. He's saved At his pulpit, $ilicon delivers his puns with the conviction of the saved driven by fire and brimstone and in the manner of a pep talk at a sales meeting. ``Has your data been saved? Are you ready to log onto the heavenly host mainframe? For I am the wafer, the truth cable, and no math comes to the flow chart but through memory.'' Launching into his most infamous Sermon on the Monitor, St. $ilicon begins with the Garden of Eden, or the Griden of Readen. ``The Griden of Readen is where Odd-om learned to love hierarchical structures and upper management. The Griden was surrounded by babbling tributaries with wide bandwidths, and out of its assumptions grew many different kinds of tree structures branching out in all directions. And in this spot, man was to be provided with a great variety and a great abundance of code. ``To create Eve-n, the Giver of Data cast Odd-om into a DP sleep. And so it was that Odd-om got Eve-n.'' The serpent, which $ilicon describes as an ``excellent salesman and on commission,'' enters. 
``D-worm entered the garden through a serial port and wrapped himself around the tree.'' Once his audience is through groaning, Armstrong hopes that something more than tortuous puns sinks in. Labeling himself the patron saint of appropriate technology, he addresses the industry's underlying technological assumptions. ``At first, when you go to work in the [computer indus- try], you think that decisions are being made to serve the customer. But then you realize that the shareholders are protecting their own assets and interests.'' He uses the original Apple personal computer as an example. He said that the company has never been able to develop a good network because, until recently, it has positioned itself as the anarchist company making the computer for the little guy. ``Why do you want all those people being networked together and dominated by the MIS director?'' he said of the company's early image. Does this make him unpopular with executives? He says that on the contrary, executives are so taken with the humor in his diatribes that they invite him for return engagements. St. $ilicon actually does have a following, and he claims his flock is growing. He delivers a weekly Electron Mass on a San Francisco radio station. He says he has a standing invitation to the Vatican, as well as recognition in the European and U.S. press. Armstrong supports himself through his own company, St. $ilicon, Inc., with speaking engagements and sales of his desktop published Binary Bible. He says he is working on a second edition (not a New Testament) with more egregious puns than the original. Will St. $ilicon let his growing celebrity go to his memory chips? Will his hard disk get too big for his cassock? Will his disciples turn on him and bury him in Basic code, only to rise from storage on David Packard's birthday? Only the Giver of Data and the Internal Revenue Service know for sure. By J.A. 
Savage, CW staff <<<>>> Title : Tandem lands first OEM co Author : CW Staff Source : CW Comm FileName: ast Date : Jan 23, 1989 Text: CUPERTINO, Calif. _ An approximately $30 million contract with AST Research, Inc. has given transaction processing provider Tandem Computers, Inc. its first OEM agreement in the personal computer arena. Under the terms of an agreement signed last week, Tandem will distribute the Irvine, Calif.-based desktop computer vendor's entire line, which ranges from Intel Corp.'s 80286-based models to the high-end 25-MHz AST Premium/386. AST products will be distributed under the Tandem label. Starting entries, based on AST's 20-MHz 80386-based Premium configurations, are slated for a second-quarter rollout as Tandem PSX 300A workstations, a Tandem spokeswoman said. For AST, the contract is an affirmation of the company's status as a major OEM supplier, Sales Vice-President Bob Becker said in a prepared announcement. For Tandem, it signals a new departure. The company has been selling workstations manufactured by Wyse Technology. However, the Tandem spokeswoman said, ``We will be winding down that relationship over a period of time.'' It is also the maiden marketing effort of The Tandem Source Co., a recently established corporate division chartered to buy and distribute workstation and terminal products targeted at Tandem customers. Headed by former Tandem sales vice-president Lawrence McGraw, the unit is already negotiating a second OEM contract with an unidentified supplier, the spokeswoman confirmed. By Nell Margolis, CW staff <<<>>> Title : In brief Author : CW Staff Source : CW Comm FileName: 116week Date : Jan 23, 1989 Text: Altai fires back Data center scheduling software vendor Altai Software, Inc., facing a copyright infringement lawsuit by Computer Associates International, Inc., has fired back with its own suit, accusing CA of ``sham litigation'' to eliminate competition. 
In addition to $68 million in damages, Arlington, Texas-based Altai asked that CA be ordered to divest the former Uccel Corp. operations. ``The only way we can obtain relief is for [CA] to not have monopoly power in the marketplace,'' said Gary Leslie, Altai's chief financial officer. ``Altai's position is absurd,'' said Michael McElroy, CA's assistant vice-president and secretary. He charged that Altai's suit ``is based on a theory that it is all right for a small competitor to steal source code from a larger competitor, but when the larger competitor complains, that's a violation of the law.'' CA's original suit halted the proposed buyout of Altai by Goal Systems International, Inc. Spring into multiple megabytes Fremont, Calif.-based Springer Technologies, Inc. last week received financial backing of an undisclosed amount from an investment group led by controversial computer industry entrepreneur and billionaire H. Ross Perot. Springer is developing ``multiple megabyte'' computer memory-disk technology. In a licensing agreement with Early Venture Investors, Inc., Springer, through a joint venture company, will manufacture and market a wholly integrated head, flexure assembly and circuit on a chip _ reportedly an industry first. IBM buys Corning subsidiary stake IBM announced a 25% equity investment in Chatsworth, Calif.-based PCO, Inc., an optoelectronics subsidiary of Corning Glass Works and manufacturer of devices that allow high-volume data transmission on fiber-optic cable. The investment price was not disclosed. Decision decides on FDR Decision Data Computer Corp. recently made a major move to boost its third-party maintenance business by acquiring FDR Field Service Co. and a subsidiary from First Data Resources, Inc. FDR services point-of-sale systems and CPUs. Decision Data will fold FDR into its Decision Data Services, Inc. service unit, one of the four largest third-party maintenance providers in the U.S. 
<<<>>> Title : Apollo gains marketing ma Author : CW Staff Source : CW Comm FileName: migliore Date : Jan 23, 1989 Text: Apollo Computer, Inc. in Chelmsford, Mass., has taken some hefty hits from analysts and the press over its marketing ability. Last week, the organization put some muscle into its oft-stated intention to toughen its marketing arm, introducing former Texas Instruments, Inc. and Fairchild Semiconductor Corp. executive John Migliore as its new vice-president and general manager of worldwide marketing and strategic planning. Migliore has headed his own consulting firm, Creative Strategies, Inc., since 1987. He will report directly to Apollo Chief Executive Officer Thomas Vanderslice. In his fourth day on the job, Migliore characterized himself as a man with a relatively easy mission. ``A lot of companies wish they had the name recognition and the superior products that Apollo does,'' he said, noting that increasing the market's perception of such advantages will be largely a matter of ``consistency and repetition.'' The new marketing leader will get an early and auspicious public debut. On Feb. 1, he will keynote the U.S. introduction of Apollo's eagerly awaited graphics component for its DN10000 superworkstation, while Vanderslice hosts a simultaneous European introduction in Zurich. David Burdick, who follows the workstation market for San Jose, Calif.-based Dataquest, Inc., saw both the creation of Migliore's job and the selection of Migliore to fill it as positive signs. ``You don't get the luxury of a learning curve in the workstation market,'' he said. Burdick said that a veteran of the semiconductor industry already knows many of the lessons that need to be brought to bear in marketing workstations. Ashton-Tate Corp. has a newly created position _ vice-president of consulting and training services _ and the man for the job is the former vice-president of the firm's U.S. and Canada division, Barry Obrand. 
He will be succeeded by IBM veteran Wes Richards, who comes to Ashton-Tate after two years as vice-president of sales and marketing at Mountain View, Calif.-based Software Publishing Corp. Come March, look for Apple Computer, Inc. Senior Vice-President of Sales Charles Boesenberg at Sunnyvale, Calif.-based reduced instruction set computing maker Mips Computer Systems, Inc., where he will soon become senior executive. Taking over Boesenberg's post at Apple is former Apple Vice-President and General Manager of Northwest Operations William Coldrick, who will hand over his old desk to former U.S. Sales Development Director Godfrey Sullivan. Management Science America, Inc. has a new executive vice-president and chief financial officer. Former Big Eight certified public accountant and Electromagnetic Science, Inc. executive William F. Evans joined the organization recently. By Nell Margolis, CW staff <<<>>> Title : Publishing centralization Author : Larry Stevens Source : CW Comm FileName: pubbox Date : Jan 23, 1989 Text: While the trend in many companies is for corporate electronic publishing departments _ usually through MIS _ to create closer links to other areas of the organization, some firms are moving in the opposite direction. Especially in companies in which the publishing unit has not been able to provide adequate services for the user community, some departments are taking it upon themselves to create miniature corporate electronic publishing operations that are reminiscent of the distributed data processing movement. One such example is at the Cambridge, Mass.-based Automation Applications Division of the U.S. Department of Transportation. According to Project Engineer Richard D. Wright, his organization does have a publishing department _ although it is not electronic _ that had produced most of the publications that his division submitted. 
But a problem arose when Wright found himself with a two-month deadline on a 1,500-page document for the Federal Aviation Administration. ``If we send it to the [corporate] publishing department, we have to get in line behind whoever got there first, and then when we get the project back, it's never what we wanted,'' he says. Because Wright already had Sun Microsystems, Inc. terminals, he was able to buy Interleaf, Inc. software and be up and running in time to meet his deadline. Since that time, he has purchased 15 more copies of Interleaf's Technical Publishing Software 4.0 program, and most of his engineers create original drafts on them. Now, Wright says he is a true believer in decentralization and feels that this will become a trend. ``I can now outrun any mainframe with about one-tenth the cost and do it better,'' he says. Wright adds that he feels he has the ideal setup because he has placed the output source nearer to where the input source is. Because of the technical nature of the material his engineers work on, disastrous results can happen if there are errors, Wright explains. Yet long columns of numerical data, which he typically uses in his reports, are just the type of data that typesetters have trouble getting right. And because Wright's documents generally include not only text but also many different graphic elements, including diagrams, flowcharts and schematics, it is much easier to manipulate them on a screen than to hope a printer lines them up correctly. Finally, because the data is constantly changing, having the operation in-house allows Wright to alter data up to the last minute. By placing publishing and document creation in the same hands, Wright says he is able to decrease turnaround time by one-quarter and increase accuracy. While Wright does use MIS to give him spreadsheet and database functions from the mainframe, he says MIS is not involved in any aspect of the publishing operation. 
This new independence required purchasing some equipment in increments to avoid the requirement that MIS sign off on it. LARRY STEVENS <<<>>> Title : Ask the vendor Author : CW Staff Source : CW Comm FileName: askpub Date : Jan 23, 1989 Text: When I import Claris Corp. Macdraw images into Aldus' Pagemaker and print out the file, the printed images are all broken up; the image placement doesn't even match what's on the screen. How can I fix this? Gary Reymond Advertising Program Manager SCM Office Supplies Group Marion, Ind. ALDUS CORP.: The problem is inherent in the Pict format in which Macdraw images must be saved for placement in Pagemaker. Pict images have a resolution of 72 dot/in., which higher resolution printers have trouble interpreting correctly. Enlarging the image in Pagemaker magnifies the problem. For best results, import graphics such as those drawn in Aldus Freehand or Adobe Illustrator in the Encapsulated Postscript format, which solves these alignment and distortion problems. <<<>>> Title : MIS role to grow Author : Larry Stevens Source : CW Comm FileName: pubbox2 Date : Jan 23, 1989 Text: What does the future hold for corporate electronic publishing? According to Anthony Deakins, vice-president and general manager of the Composed Output Applications Division of Cincinnati Bell Information Systems, future advances in technology will naturally bring about changes in corporate electronic publishing. He sees such advances as checking account statements with scanned images of checks and more graphs with telephone and utility bills. But he adds, ``The first step is to find reasonable ways to store and transmit disparate forms of data including object-oriented and bit-mapped graphics along with text.'' One innovation in this area occurred in September with Digital Equipment Corp.'s release of the specifications for its Compound Document Architecture (CDA). 
The CDA specifications define a networked environment for creating, revising, managing and distributing compound documents containing links to text, graphics, images, spreadsheets, charts and tables. These links are ``live'' in the sense that the data is automatically updated when the source data is updated. Deakins says corporate electronic publishing today is more concerned with fonts and line spacing than with networking and direct-access storage devices. But when CDA becomes implemented and custom documents that require extensive use of the database become more common, electronic publishing will gradually move closer into MIS' purview. ``Right now, MIS has to be ready to provide the networking and storage support for corporate electronic publishing,'' he says. ``As the technical issues become more complicated, it will have to do a lot more.'' LARRY STEVENS <<<>>> Title : New thrust brings CEP clo Author : Larry Stevens Source : CW Comm FileName: publead Date : Jan 23, 1989 Text: Bonnie Whidden is willing to defer to MIS, but only up to a point. Whidden, a graphics and administrative services manager at Kawasaki Motors Corp. U.S.A. in Irvine, Calif., reports to information services and was, in fact, a data processing staff member before shifting into her current post. However, she prefers to think of the MIS role in her department's electronic publishing operations as an advisory, rather than a supervisory, one. ``MIS acts as a consultant for us,'' Whidden says. ``They are part of a research committee and implementation committee. They have their expertise, we have ours, and other areas of the corporation such as marketing also need to have input.'' Robert Shepard, vice-president of information services, basically agrees with that assessment. His primary concern, he says, is to make sure that the corporate electronic publishing (CEP) department fits in with the rest of the organization. 
``There is a real productivity gain if the text and graphics files created by the various user departments can be transferred either via network or through disk to the CEP area,'' he explains. Although Shepard says he does not make the final decision about what hardware and software equipment the CEP department should buy, he does examine purchase requests to make sure the equipment is compatible with what other users are working on. ``We're looking for something that is good for the entire organization vs. people making individual decisions,'' he says. So far, that mission has not conflicted with the interests of the electronic publishing operation. ``Everything they've requested has fit into our compatibility requirements,'' Shepard says. Unfortunately, that degree of agreement is somewhat rare in this uncertain period, when MIS organizations are only starting to figure out what their responsibilities should be in relation to expanding corporate electronic publishing operations. Through the back door Like personal computers in the early 1980s, corporate electronic publishing equipment has often arrived in corporations through the back door _ with departmental funds and without MIS involvement. For a long time, the impact of these systems was minimal and could be safely ignored. Now however, corporate electronic publishing is evolving into an enterprisewide activity and being accorded strategic importance. ``It's a $3 billion business this year that has grown from almost zero a few years ago,'' says Vera Allen-Smith, an industry analyst at market research firm Dataquest, Inc. in San Jose, Calif. ``Companies are taking seriously the concept of the competitive document that says that if the content of my document and your document are the same, but mine looks better, I win.'' As a result, MIS has virtually no choice but to become involved at some level. 
``MIS is being pulled into corporate electronic publishing reluctantly, just as they were pulled into the PC arena,'' says Virginia Campbell, director of the electronic publishing center at the Dallas Infomart. MIS managers are not necessarily anxious to get involved with publishing activities, Campbell says, but they also are not about to make the same mistake they made with PCs _ letting the technology grow wild and without standards. ``They don't want that to happen to them again. So they're asking questions,'' she says. Different questions Not surprisingly, the questions that MIS managers ask are not the same questions that graphics and publishing people ask, according to Campbell. ``MIS wants to know about security, file management and power,'' she says. ``It's important information and things that the graphics people don't ask about. I think there could be a good marriage between the two as long as jealousies don't get in the way.'' Whether it is jealousy or just different points of view, the relationship between MIS and electronic publishing can get off to a rocky start. At GTE Corp. in Thousand Oaks, Calif., for example, Mel Taylor, supervisor of composition, says MIS' control of a new system purchased two years ago nearly produced a disaster. ``It was MIS' job then to assimilate our wish list, research the type of equipment that would best satisfy our needs and make recommendations to management,'' Taylor recalls. But, he adds, ``They didn't do a very good job of it. This is not to denigrate them, but it's just that they were used to looking at different things than we were.'' According to Taylor, MIS ignored most of the composition issues that he considered important and concentrated instead on information such as the number of pages (200,000) the department would have to process each year. Using hard numbers as a guide, MIS determined the best system that could meet the requirements considering cost, efficiency, speed and capacity. 
Although MIS did an excellent job of meeting its own parameters, Taylor says it downplayed issues such as the number and types of fonts, the ability to kern or manipulate line endings and output to typesetting equipment. ``The system it proposed wouldn't even do the same job that the system it was replacing did,'' Taylor says. Taylor's requirements were taken into account after further negotiations; now, he says, relations with MIS are good. But then again, the MIS department does not have the authority that it originally had. ``MIS advises us and helps us. But we have the final decision,'' Taylor claims. That situation highlights a problem of growing urgency. As CEP becomes a more important part of the firm, it is not really clear in many organizations who controls the system. Herman Prescott, vice-president of SEI Information Services, an Atlanta-based consulting firm that helps set up CEP operations, says that the real difficulty is that everyone feels entitled to some input. ``Everyone _ MIS, marketing, public relations _ feels that they should have a say,'' he notes. Certainly MIS' claim is neither entirely clear nor well-established in most organizations. Many firms start with PC-based electronic publishing systems and then graduate to workstations tied to minicomputers or mainframes. Some of these systems are later connected to other users or the corporate mainframe through a network. As a system grows, and as networking and corporate standards become necessary, MIS involvement may also grow. The extent of that involvement differs greatly from organization to organization. ``A lot of companies are now at the point where they are trying to standardize both the forms that their publishing departments will create and the software _ such as spreadsheet or word processing or page layout _ that users throughout the corporation use to create those forms,'' Dataquest's Allen-Smith says. 
``Standardization will eventually make possible the kind of connectivity that MIS is used to and that will require more MIS involvement.'' Many electronic publishing managers are willing to accept MIS assistance in those areas in which their activities intersect with other parts of the corporation. However, most strongly resist the idea of overall supervision from MIS or encroachment from that quarter into the day-to-day operations of publishing units. Even at Kawasaki, where the ties between MIS and corporate electronic publishing are stronger and more direct than usual, it is not quite clear what might happen if the relationship were tested by the kind of conflict that occurred at GTE. ``So far, we haven't had to make any sacrifices, since everything CEP requested did fit into our compatibility requirements,'' Shepard says. ``But if necessary, we would veto equipment if it didn't fit in.'' Even if Whidden, who comes from a data processing background, were inclined to be more compliant, GTE's Taylor is more typical of publishing managers in general. He comes from a graphics, rather than a technical, background. Taylor's technical sophistication has grown as his department has grown, and one of his major responsibilities now is systems management. His department is responsible for producing 8,000 different forms, plus display ads, invoice stuffers, public relations material and proposals. Such on-the-job learning is possible, explains Frank Romano, editor of Typeworld, a Salem, N.H.-based publication for users of electronic publishing systems, because vendors recognize the technical limitations of corporate publishing staffs and have emphasized ease of use and ease of maintenance in their designs. In many organizations, corporate electronic publishing grew out of a nonelectronic counterpart, in which people who may have been hired to run typesetting equipment or even do proofreading are now called on to maintain a computer system. 
Jim Gayan's operation at North Carolina State University in Raleigh is a typical example of just how far publishing managers are willing to let MIS into their operations. Gayan is the service section supervisor in a department that prepares camera-ready copy for all pamphlets published by the North Carolina Agricultural Extension Service. He uses the university's MIS department for consultation and direction, but for day-to-day operations, he has a system manager who was formerly a journalist. ``If you're working in electronic publishing, you have to know electronics,'' he says. ``For us, a journalist is no longer someone who hunts and pecks on a typewriter; he has to understand computers.'' Romano says that there are two ways MIS typically gets involved in CEP. Only one way, however, is right. He says, ``If MIS gets to choose or control the equipment simply because it has computers in it, that's wrong. Cars are full of computers these days, but you wouldn't have MIS select the corporate fleet. Electronic publishing is more involved with publishing than electronics. The right way is for MIS to help coordinate the publishing department with other areas of the corporation and to help with networking.'' That is, in fact, the point at which most MIS involvement seems to be concentrating at the moment. Gayan, for example, is now working closely with MIS for the next phase of his operation, which is to link all the offices at the university that send him data through a LAN. ``But,'' he says, ``we put the twisted-pair cabling in place mostly by ourselves. This is primarily our project, and we are learning how to do it.'' At Acurex Corp. in Mountain View, Calif., Sarah Paralo runs a 22-person CEP operation that supports all four of Acurex' major divisions _ aerospace systems and equipment; aircraft galley refrigeration equipment; the data acquisition, measurement and control division; and the environmental division. 
Her story is a similar one of day-to-day independence and selective alliances with MIS. The department's current systems administrator, who was formerly a typesetter, is responsible for maintaining the department's Apple Computer, Inc. Macintoshes, IBM PCs, Sun Microsystems, Inc. terminals and Docupro, Inc. software, as well as doing some Unix programming. Paralo does, however, call on MIS for consultative assistance in two areas. One problem she needs help with relates to the production of proposals, which requires tapping into the corporate database. For that purpose, Paralo has depended on MIS to set up the links and security safeguards so that her department can get the data it needs. Electronic generation of proposals is already critical and is likely to become more so as time goes on, Paralo says. ``In the proposal area, CEP is no longer a luxury,'' she says. ``Now that the government knows that our industry is capable of producing it, they are becoming so demanding that we simply couldn't reasonably create a proposal using an outside printing house, even if we wanted to.'' Paralo is also working with MIS to maintain the Kermit and Ethernet network links between her department and the company's 130 Digital Equipment Corp. VAX terminals. The network allows her to receive and send information needed for the development of publications; in addition, it enables her to communicate with those in the organization who use her department's services. She also expects MIS involvement to grow as more people in the company adapt to doing their work electronically. ``While some engineers are totally on keyboard, some still handwrite or cut and paste,'' she says. ``Before MIS involvement grows, the engineering, marketing and other departments that feed us things have to have data in electronic form. 
MIS can't help us if our users want to scribble something on paper and send it to us by messenger.'' Paralo sums up her relationship with MIS this way: ``MIS helps get the material to us. We're responsible for formatting the document, editing it, [providing] technical writing support if necessary, assigning photography, creating a binder and interfacing with the photo lab and art department. There is no way anyone can even imagine that any of that should be under MIS' purview.'' While MIS does have to sign off on hardware and software purchases, Paralo says that this requirement rarely precipitates a conflict. The relationship between CEP and MIS is also a good one at General Atomics, Inc., a San Diego-based company that does research in nuclear energy. But that affinity is probably to be expected, because CEP is handled there through the company's information center. What is more remarkable in that circumstance is the degree of separation that can be observed. According to Nancy Hodder, manager of the information center at General Atomics, MIS and the information center have clearly defined roles. MIS controls the data, and the center controls the output. All the data is stored on the corporate mainframe and all proposal writing, which is the major publishing application, is done on Microsoft Corp.'s Word-II, also on the mainframe. The software, hardware and connections to the mainframe are controlled by MIS, but once the data leaves the mainframe and goes to General Atomics' Interleaf, Inc. Technical Publishing Software system, it becomes the responsibility of Hodder's department. ``We take care of all installation, communication and repair of the terminals as well as teaching courses to others in the corporation,'' she says. ``We get a lot of support from MIS when we have to interface with their equipment.'' It remains to be seen whether more active MIS intervention in electronic publishing will become necessary. 
SEI's Prescott says that many organizations are now considering the creation of a new position called chief publications officer, who will have overall control of the operation. It is not expected that this person will either emerge from or report to MIS, but instead will likely look to that department for consultative assistance and additional support. By Larry Stevens; Stevens is a free-lance writer based in Springfield, Mass. <<<>>> Title : So your users want a PIM Author : Alan Radding Source : CW Comm FileName: pimid Date : Jan 23, 1989 Text: Comdex may be the ultimate metaphor for the information industry, encompassing both the great promise and overwhelming burden of information systems: There is so much information that one person cannot assimilate it. Adam Rostoker appreciates, better than most, the overwhelming amount of information that surrounds the Comdex exhibition and conference. Last year, Rostoker worked for The Interface Group, Inc., which runs the show. As conference manager for the spring event, he had to juggle the needs and demands of the exhibitors, visitors and show managers. He had to concern himself with everything from the trivial to the momentous. Managing Comdex is enough to drive anyone slightly crazy. So to cope with the incessant flood of information, Rostoker turned to his Intel Corp. 80286-based NEC Corp. personal computer and a new category of software called personal information managers (PIM). The particular PIM that Rostoker used was Info-XL from Valor Software in San Jose, Calif. ``I handled all of Comdex last spring with Info-XL and word processing software,'' says Rostoker, who is now the executive director of the Windows/Presentation Manager Association. During Comdex, with the help of Info-XL, Rostoker says he was able to keep track of the endless number of people, projects, due dates, meetings, deadlines and varied bits of constantly changing information that is required to mount the giant trade show. 
Today, in his new position, Rostoker uses another product _ a Microsoft Corp. Windows-based PIM called Packrat from Polaris Software in Escondido, Calif. From soup to nuts Like running a large trade show, managing a United Way campaign within a large organization is a chore that can drive even the hardiest manager a little bit nuts. But after doing such a good job of organizing the United Way effort in his division last year, Robert Rubin, vice-president of MIS at Pennwalt Corp. in Philadelphia, was rewarded with the task of doing the same thing in 1988 throughout the entire company, an international chemical firm with $1 billion in sales. Rubin turned to another PIM, Grandview from Symantec Corp. in Mountain View, Calif., to help him with his return to the United Way drive. ``I wasn't going to use it simply [for] a general to-do list or anything like that. I used Grandview as a strategic tool,'' he explains. With the product, Rubin found he was able to pull together the bits and pieces of reports and documents that he and others had previously developed. Next, he combined that information with more recent in-house memos and database documentation to create the documents that detailed the strategy for the latest campaign. He then circulated the resulting strategic plan to other senior company executives for action and feedback, which was incorporated, via Grandview, into the next iteration of the strategy. Yellow sticky-note deluge Today, people are inundated with information. The flood, driven in large part by computer systems, overburdens the conventional methods people have long used to manage the everyday flow of information. Desks are now cluttered with lists on yellow-lined sheets of paper, brief notes stuck all over the telephone, clippings, calendars obliterated by meetings scheduled and changed repeatedly and reports piled on top of each other. To deal with the ever-growing problem, PC users are, indeed, turning to PIMs. 
``Personal information managers are the first products specifically designed to solve a problem that computers created _ information overload. We can create it faster than we can assimilate it,'' claims Valor Software's Steve Sando, the developer of Info-XL, in a research paper on the general subject of PIMs. Complicating the job of information management is the fact that people absorb their information in so many different forms simultaneously. Rarely does an individual work only with text or numbers or neatly structured files. ``We all have lots of different information: lists, documents, spreadsheets, notes. That's always been the problem,'' says Jeffrey Tarter, publisher of the ``Softletter'' newsletter in Cambridge, Mass. ``When we're involved in a project, we use it all.'' If you need to manage numerical data, there is the spreadsheet. If you work with text, there are word processors and outliners. For structured data, there is a range of database products, from simple file managers to full-powered relational databases. Each category of software manages one type of information. ``So far, all we've really done is automate the pieces,'' Tarter notes. Even integrated software products that provide the smooth flow of data from one application to another do not fully solve the issue of multiple types of data. ``The user needs to relate letters, memos, notes, names, companies, topics and dates. Integrating applications provides only a very small portion of the answer,'' Sando writes. Users still must make the connections between different types of data. He points out that instead of integrating applications, the emphasis must be on integrating data. Personal information managers are an attempt to automate and integrate the entire flow of data. The software accepts data in any form from any source and allows users to access that data in whichever ways they choose. The PIM, with minimal input from the user, automatically structures the data. 
Surging interest A sudden surge of public interest in PIMs developed last year as the first fully functional PIM products _ such as Lotus Development Corp.'s Agenda, Dayflo Software Corp.'s Dayflo Tracker and Info-XL _ appeared. ``These [types of] products transcend the traditional boundaries'' between existing software categories, Tarter notes. Until the advent of PIMs, users were forced to ``interface with the computer in a flawed manner,'' Rostoker claims. Users tried to force their particular problems into the structure of whatever application they were most comfortable with, such as the spreadsheet or a database. ``PIMs break through that barrier,'' Rostoker continues. ``They are more task oriented.'' With a PIM, the goal is managing information rather than creating a database or a spreadsheet. Predecessors to the current PIMs have been available for some time. Programs such as Borland International's Sidekick, Asksam Systems, Inc.'s Asksam and a variety of other products offer limited PIM-like capabilities. ``There are a host of products to manage the notes that accumulate,'' Tarter says, but these products all lack the cross-referencing capabilities that are the essence of a true personal information manager. PIMs, ideally, are software products that allow the user to input information of any size and type from any source and then extract portions of that information in any sequence, depending on connections either the user or the system has made. Personal information managers provide the ability to store, retrieve and sort any data to allow the user to draw inferences and make connections that were not explicit at the time of the data's entry. 
For instance, the same information entered randomly from a variety of sources concerning people, dates, projects, meetings, significant data, financial projections and competitive reports can be returned by the PIM as a prioritized list of things to do, a project timetable, a profile of an individual's activities or progress, a marketing analysis, a sales forecast or something else, depending on how the user wants to view the information. What they're not PIMs are not likely to replace spreadsheets, databases or word processors, although various PIM products have strong elements of those applications. Instead, the current expectation is that information managers will share the computer with one or two of those other primary applications and be used in conjunction with them. For instance, databases are best for dealing with large quantities of highly structured, relatively static data, but they are ``too structured to use to synthesize changing data,'' explains Andrew Hammond, product marketing manager for Lotus's Agenda, one of the industry's leading personal information managers. Word processors were designed primarily to create data, not organize and retrieve it. So, too, outliners do not have the power to do much more than duplicate what users already do on a notepad. Spreadsheets, in some ways, are closest to PIMs, because the spreadsheet gives the user the capability to relate one piece of data to another and change the data and relationships at will. Unfortunately, spreadsheets can only exercise those capabilities on numbers for quantitative analysis. The most likely scenario for the widespread use of PIMs is that they will exist in a synergistic relationship with the current popular applications. Most PIMs provide the capability to import spreadsheet, word processing and database files into the PIM. 
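The idea described above _ one pool of items, entered in no particular order, returned as a prioritized to-do list, a timetable or an individual's activity profile _ can be sketched in a few lines. This is a toy illustration of the concept only, not the design of Agenda or any other product; all field names and sample entries are invented.

```python
# Hypothetical sketch: the same unstructured pool of items, viewed three ways.
items = [
    {"text": "Call printer about proofs",  "due": "1989-02-03", "priority": 2, "person": "Smith"},
    {"text": "Draft FAA proposal outline", "due": "1989-01-30", "priority": 1, "person": "Jones"},
    {"text": "Review Q1 sales forecast",   "due": "1989-02-10", "priority": 3, "person": "Smith"},
]

def todo_list(entries):
    # View 1: prioritized to-do list, most urgent first.
    return [e["text"] for e in sorted(entries, key=lambda e: e["priority"])]

def timetable(entries):
    # View 2: project timetable, chronological (ISO date strings sort correctly).
    return [(e["due"], e["text"]) for e in sorted(entries, key=lambda e: e["due"])]

def profile(entries, person):
    # View 3: activity profile for one individual.
    return [e["text"] for e in entries if e["person"] == person]
```

The point is that no view is privileged: the user decides at retrieval time how the same data should be organized.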
The more sophisticated information managers, such as Persoft, Inc.'s Ize, offer macro-like capabilities, called hot links, which will automatically retrieve data from outside sources. Hammond says he sees the spreadsheet and the personal information manager operating in a left-brain/right-brain scenario, depending on whether the user's immediate need is for strict numerical analysis or for a broader, more intuitive view. PIMs are directly related to the concept of hypertext. Essentially, hypertext refers to the interconnection of static information in nonlinear or unstructured ways. ``Hypertext creates a network of links so that you can traverse information in a variety of ways,'' explains Persoft Chairman Ed Harris. Hypertext programs basically rely on multiple keywords to identify each item of information. A basic fact can be tagged with a keyword designating its subject area, its source, its time- and date-stamp or anything at all. The user can then turn to a keyword or combination of keywords to view what otherwise appear to be unrelated items. Hypertext is not a new idea; it has been kicking around the mainframe industry for several decades. The technology got a big boost, however, with Apple Computer, Inc.'s introduction of Hypercard in 1987. While Hypercard utilizes hypertext, it is not a PIM in the way that Agenda, Info-XL, Grandview, Dayflo Tracker or a growing number of other offerings that have come to define the category are. Hypercard is more like an unstructured database that allows the user to create hypertext links between elements in the database. But it is the hypertext qualities of PIMs, along with other advanced features such as rudimentary artificial intelligence, that separate the new products from their predecessors. ``The first generation of information managers simply presented users with a linear list,'' Harris notes. Users got from the program only what they put into it, in generally the same form. 
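The keyword-tagging scheme Harris describes _ every item carries multiple keywords, and asking for a combination of keywords surfaces items that were never explicitly related _ amounts to an inverted index with set intersection. The sketch below is a hypothetical illustration of that mechanism, not the internals of Ize, Hypercard or any other product; the notes and keywords are invented.

```python
from collections import defaultdict

class KeywordIndex:
    # Hypothetical hypertext-style index: keyword -> set of item ids.
    def __init__(self):
        self._index = defaultdict(set)
        self._items = {}  # item id -> text

    def add(self, item_id, text, keywords):
        # Tag one item with any number of keywords (subject, source, date, ...).
        self._items[item_id] = text
        for kw in keywords:
            self._index[kw].add(item_id)

    def find(self, *keywords):
        # Items tagged with ALL the given keywords (set intersection).
        if not keywords:
            return []
        ids = set.intersection(*(self._index[kw] for kw in keywords))
        return sorted(self._items[i] for i in ids)

idx = KeywordIndex()
idx.add(1, "Meeting notes on booth layout", ["comdex", "exhibitors", "1989-01"])
idx.add(2, "Exhibitor contract deadline",   ["comdex", "exhibitors", "deadline"])
idx.add(3, "Keynote speaker travel plans",  ["comdex", "speakers"])
```

A query such as `idx.find("comdex", "exhibitors")` then pulls together items that were entered at different times and never linked to each other directly.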
The latest generation of PIMs, however, has added intelligence, which allows the programs to automatically reorganize and restructure the information, resulting in unanticipated connections, he adds.
Leaders of the pack
Although it is still too early to rank PIM products by market share and sales, several products have emerged as the leaders of the pack. ``With a $6 million advertising budget, Agenda is really defining the category. It is our biggest seller,'' says Lou Levine, product manager at Corporate Software, Inc., a Westwood, Mass., software distributor specializing in corporate sales. Other major players at this early point include Ize, Grandview and Info-XL. A number of others, such as Dayflo Tracker, Conductor Software, Inc.'s Act, Asksam and Microlytics, Inc.'s Gofer, may also be regarded as PIMs, but these products have not caught on the way that the market leaders have. ``Agenda, Grandview and Ize are the biggest sellers,'' Levine notes. Ironically, the three leading offerings are the least alike, and in some sectors, Grandview and Ize may have trouble qualifying strictly as personal information managers. ``Ize came out of text-based retrieval systems, and we added innovations like keyword outlining and hot links,'' Persoft's Harris says. Grandview is more correctly termed an outliner, the direct descendant of Symantec's Thinktank. Products such as Dayflo Tracker, which relies heavily on a forms approach, and Act, which is highly structured for the sales environment, may also be categorized as personal information managers, but analysts see them filling more specialized niches. ``They do a better job for some particular problems, but they are not really general purpose,'' Tarter says. What do general-purpose information managers do? Although project management is considered a key task for PIM applications, the programs are not comparable with formal project management software. 
``PIMs are used to manage a mass of information, but not in the structured way classic project managers do,'' Tarter points out. PIMs are ``terrific for project planning, but not project management,'' says Gary Josie, executive vice-president of Valor Software. By project planning, Josie means the events leading up to the formal start of the project. Project management, on the other hand, refers to critical-path scheduling and other formal project management activities, according to Josie. And who would use an information manager? In many cases, the first PIM users are the corporate users who already use spreadsheets, databases and word processors. ``Our customer is probably a manager in a group with several people to oversee and maybe several different projects going on,'' Josie says. Analysts and marketers, however, suggest that eventually the products could attract broad classes of new users to personal computing. ``Two of Info-XL's biggest markets are detectives _ the FBI _ and ministers,'' Tarter reports. ``Neither of these two jobs are traditionally computerized.'' Ministers are reportedly using the program to manage varied information about their congregations, whereas the detectives, presumably, are using the program to sort through the piles of information that they accumulate in the course of an investigation. The marketers hope that new users will be drawn to the PIMs, thus expanding the universe of desktop computer users, but that has not happened yet in any large sense. Rostoker points out that there are two old computer problems that still must be overcome if PIMs are to attract large numbers of new computer users. First, there is the problem of keying in data, and second, there is the more basic problem of access to a computer. If a person does not have regular access to a computer or is not willing to key in data, a PIM will not be helpful. 
Peter Piper picked a PIM
Picking a PIM, at least until the category becomes cluttered with me-too programs, is relatively easy, because there are clear differences between the programs. Each PIM comes from a different prototype and takes a slightly different approach to the problem of information management; therefore, each product is better suited for some tasks than others. Agenda comes very close to Lotus's 1-2-3 metaphor and comes out of the need to manage people and projects. Ize arose from text retrieval. Dayflo Tracker comes from filling blanks in forms. As a result, each product offers its own strengths and appeals. ``I use Grandview. I love its outlining,'' Harris admits. Agenda is better than Ize for managing a project, he adds, but Ize cannot be matched when it comes to managing large amounts of data.
The Ize have it
Bob Didner, senior staff consultant at Dun & Bradstreet, Inc., in Berkeley Heights, N.J., presents a classic example of what Ize can do. He spends most of his time researching and writing reports. For these projects, he might download several hundred abstracts from on-line information services like Dialog. Rather than wading through all the information piece by piece, Didner uses keywords through Ize to catalog and classify the massive amount of information. The program will then automatically outline all the data in hundreds of abstracts. Before he had Ize, he manually outlined his projects, put data on individual cards and then physically arranged the information. ``I had scraps of paper all over. I had Dialog printed out on paper. It was only sequential,'' Didner recalls. ``Then I would use an outliner, but it was too constraining.'' With Ize, he can generate tremendous amounts of material without regard to structure and organize it later. Mary Dolce, by comparison, is a typical Agenda user. As a systems consultant at Baxter Healthcare Corp. 
in Deerfield, Ill., and the president of the Chicago Computer Society, Dolce must keep track of many small pieces of information, some of which relate to her clients and some of which relate to the computer society. Before she had Agenda, she ``would write a lot of notes and then lose them,'' she says. Agenda is particularly valuable to Dolce, she claims, because most of her material is very short and much of it is date-specific. ``I have to track 20 computer society meetings each month and keep up with the volunteers,'' she explains. Dolce says she also considered Ize but rejected it because it was too text-oriented. Agenda, however, allows her to enter her short pieces of data in a free-form fashion and then structure it later. Agenda automatically tracks dating, even with indirect references such as ``next week.'' Grandview and Info-XL come to information management through the outline metaphor. ``If you are looking for something to help you organize and present information, then that's Grandview. If you want something that will discover hidden relations between things, then you probably want another product,'' says Pennwalt's Rubin. Like the spreadsheet before it, the PIM is making its way into the corporate environment as a small grass-roots movement. ``The more proactive users are starting to talk about Agenda,'' says Leslie Fiering, assistant vice-president of advanced technology at Bankers Trust Co. in New York. And, prodded by executive interest, Virginia Talamo, microcomputer technologies manager at Coopers & Lybrand in New York, is starting PIM demonstrations and beginning the new software evaluation process. While it is too early to tell if one of the current products will succeed as the corporate standard, early betting is on Lotus' Agenda, which can ride on the coattails of 1-2-3 _ a product that is already pervasive throughout the corporate environment. 
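The kind of indirect date handling Dolce describes, turning a phrase such as ``next week'' into a concrete calendar date, can be sketched as follows. The rules here (for instance, reading ``next week'' as the following Monday) are a guess at the general idea, not Agenda's actual date-recognition logic:

```python
# A sketch of resolving indirect date references against a known
# current date. Illustrative rules only; Agenda's real date
# recognition is far richer and its rules are not shown here.
from datetime import date, timedelta

def resolve(phrase, today):
    phrase = phrase.lower().strip()
    if phrase == "today":
        return today
    if phrase == "tomorrow":
        return today + timedelta(days=1)
    if phrase == "next week":
        # interpret "next week" as the Monday of the following week
        days_until_monday = 7 - today.weekday()
        return today + timedelta(days=days_until_monday)
    raise ValueError("unrecognized phrase: " + phrase)

today = date(1989, 1, 23)            # a Monday
print(resolve("tomorrow", today))    # 1989-01-24
print(resolve("next week", today))   # 1989-01-30
```

The point is that the user types the phrase once, free-form, and the program quietly attaches a real date that it can later sort and track against.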
Fiering predicts that organizations will eventually enforce a flexible standard with regard to PIMs. ``To the extent that you need to standardize to receive the economies of scale, there will be a corporate standard,'' she says. That particular standard will be enforced more through the availability of support, templates and macros than by corporate command. ``I can't see MIS worrying [about] how people keep their personal phonebooks,'' she concludes.
Corporate PIMheads
The key to acceptance in the corporate environment is the rise of personal information manager power users, whom Rostoker calls ``PIMheads.'' The power users will develop the templates, macros and various structures needed to pull out information. These structures will become standard within each organization, much the way 1-2-3 macros have. To speed the process, the major marketers are touting add-on products that will give users immediate access to effective structures for specific applications. Indeed, a marketing battle is building for dominance of the PIM arena, which the manufacturers regard as potentially as large as the spreadsheet niche. ``Lotus is starting to dominate, and if the trend continues, Agenda will probably become the main product,'' Corporate Software's Levine says, ``although from a function standpoint, that shouldn't be the case.'' Other industry observers suggest that one PIM will never dominate because user needs are too varied for one product to effectively fit all users. ``There will be four or five dominant PIMs because people will use different products for different reasons,'' predicts Dayflo President Bob Gilchrist. Like word processing, as many as a half-dozen preferred products will coexist at the top of the market without a clear-cut leader, he says. 
While it is too early to predict whether personal information managers will develop into a standard analytical tool and become as pervasive as spreadsheets, the category is destined to quickly find its way into public consciousness: Agenda even plays a key role in the newly released movie ``The January Man.'' In the film, a hot-shot, high-tech detective uses Agenda to solve a crime. Lotus is hoping that ``The January Man'' will popularize PIMs and Agenda in much the same way that the movie ``War Games'' glamorized computer hacking. By Alan Radding; Radding is a Boston-based author specializing in business and technology. <<<>>> Title : Monitoring the inaugurati Author : CW Staff Source : CW Comm FileName: inaugura Date : Jan 23, 1989 Text: WASHINGTON, D.C. _ The committee planning Friday's inauguration of President-elect George Bush is busy with more than simply planning and keeping track of the event and its associated pageants and parades. It is also selling souvenir license plates made at prisons. The license-plate operation springs up every four years and dates back to 1933, but this year the Presidential Inaugural Committee (PIC) is taking advantage of 36 personal computers donated by NCR Corp., according to Bill Johnson, who is in charge of logistics for the inaugural committee's marketing department. Back in the operations center, Chuck Williams, who became responsible for the center on Nov. 16 shortly after retiring from the U.S. Air Force, has a staff of seven to ensure that the 36 official inaugural events run smoothly.
Second time around
The Bush inauguration is the second for Williams, and it can be hectic at times, he said. When the PIC moved into its headquarters in November, it was nothing more than an empty warehouse. ``While these guys were putting together the hardware and software and trying to find a desk to set this equipment on, workers were building the walls around us,'' Williams said. 
In the marketing area, most of the microcomputers _ NCR's PC810 and PC916 models _ are used to handle orders and monitor trends in sales of inaugural memorabilia, including license plates. Twenty PCs are linked into a Novell, Inc. local-area network, Johnson said. In addition to using typical software packages such as Wordperfect Corp.'s Wordperfect 5.0 and Borland International's Paradox, the marketing department is using custom software from American Management Systems, Inc. (AMS), a firm based in Arlington, Va.
Prison plates
The AMS software handles order entry, processing, billing and label printing for the license-plate operation, drawing from a database in the Novell file server. Orders are sent by Panafax Corp. facsimile machines to two prisons assigned to make the commemorative license plates. In the operations area, the three PCs are hitched to monitors so that information can be viewed and captured. The group is connected through two Novell networks with other PCs throughout the building. One of the networks runs Directline from Must Software International, which is being used to update event information from word processing, graphics and database packages. Data from outside sources such as the National Weather Service is also fed to the server. Information is accessed through handheld remote control devices rather than keyboards, Williams said. So far, the computer systems have run smoothly in the high-pressure environment _ the staff works 14-hour days, seven days a week _ and 15,000 license plates have been sold, Johnson reported. By Alan J. Ryan and Mitch Betts, CW staff <<<>>> Title : News shorts Author : CW Staff Source : CW Comm FileName: short116 Date : Jan 23, 1989 Text: Virus bites term papers?
The infamous Scores virus, which has caused Apple Computer, Inc. Macintoshes at numerous corporations, universities and government agencies nationwide to bomb in recent months, has popped up again. 
This time, the virus shut down a Mac in a student computer laboratory at the University of Oklahoma's Bizzell Memorial Library, according to university officials. Several students lost data before the virus was swept off the Macintosh's hard disk drive with a vaccine program. To ward off further attacks, the library is stationing extra computers at the entrance of the computer lab. These machines will be used to test programs and data disks for viruses before they are used on computers in the laboratory, according to a spokesperson.
Jason virus strikes again
Meanwhile, last Friday, the 13th, brought unconfirmed reports of a virus attack on PC systems in the UK. According to published reports, the virus slowed systems and deleted files; it reportedly hit a UK-based firm with over 400 personal computers. ``My gut feeling is that it is a modified version of the Israeli virus,'' said John McAfee, president of the Computer Virus Industry Association in Santa Clara, Calif. The Israeli virus contained a time bomb designed to go off on May 13, 1988, the 40th anniversary of the last day of Palestine. ``I don't know if it is a virus or individual errors,'' said Harold Highland, editor-in-chief of Computers & Security. Highland has isolated the Israeli virus and said it resides on the PC's boot sector and should be easily discovered with the help of tools such as The Norton Utilities from Peter Norton Computing.
MAI keeps Prime bid alive
MAI Basic Four, Inc. last week extended through Thursday its $20-per-share bid for Prime Computer, Inc. A hostile takeover attempt from the start, the MAI/Prime affair has grown vitriolic in the past days, with allegations of injustice to shareholders whizzing across the wire and through the mail in both directions. As of last Thursday, MAI had garnered 64.8% of Prime's outstanding stock; under relevant Delaware law, the would-be hostile acquirer needs 85% to bring off the buy.
Ashton-Tate targets MIS
Ashton-Tate Corp. 
last week unveiled a consulting and training group that is aimed at helping MIS departments implement new database architectures and communications systems. The group numbers fewer than 100, but it will be growing throughout the year. The focus of the group will be the Ashton-Tate/Microsoft/Sybase SQL Server, Dbase/VAX and Dbase-to-host communications products. According to group head Barry Obrand, Ashton-Tate consultants will be able to recommend and work with non-Ashton-Tate applications. Fees will be determined on an hourly or project basis.
Unisys moves up shipments
Unisys Corp. said it will get its 2200/600 mainframe series out a month ahead of schedule with the delivery of one model to New Jersey Bell in February. The company also recommitted to a March shipment date for its smaller mainframes, the 2200/400 line, which were supposed to be out by the end of 1988. The company announced in September that the 400s would be three months late.
Taken down a peg
IBM may have enjoyed a comeback year in the mid-range marketplace and on the bottom line, but it slipped in the eyes of corporate America. After six years as Fortune magazine's most admired computer corporation and several more as the most admired U.S. firm, the industry giant fell behind Hewlett-Packard Co. and Apple Computer, Inc. among computer vendors. Digital Equipment Corp. dropped to fourth from second last year, while Amdahl Corp., a newcomer to the computer top 10, came in sixth. Electronic Data Systems Corp. remained the most admired diversified services firm, and Control Data Corp. stayed in the top 10 least admired companies in all industries. <<<>>> Title : On SQL Server's test trai Author : CW Staff Source : CW Comm FileName: 1sqlserv Date : Jan 23, 1989 Text: A glaring shortage of front-end development tools and the lack of a finished product have not stopped a handful of the U.S.' largest corporations from hatching SQL Server applications. 
SQL Server, announced one year ago by Ashton-Tate Corp., Microsoft Corp. and Sybase, Inc., is a multiuser database engine designed for the OS/2 operating system. It implements the so-called client/server architecture, under which a server handles data management while users' workstations provide the interface or front end. Users who are anxious to get cracking have been forced to buy the $1,995 Network Development Kit, which is essentially a beta-test version of the software. Of the 800 units sold, some 200 to 250 have been snapped up by corporations, according to Microsoft SQL Server product manager Dave Kaplan. System One Airplane Services, a subsidiary of Texas Air Corp., already has a prototype application running under SQL Server, and the results are promising. Curt L. Abraham, manager of applications development, moved over a database that formerly ran under Paradox, a database management system from Borland International. It took an hour to query the disk-based Paradox; the same query took one minute under SQL Server, Abraham said. If all goes well, the subsidiary will provide travel agents with a variety of SQL Server-based systems, according to developers at System One. The biggest problem for SQL Server developers is the scant selection of tools. Several personal computer software vendors have promised to port their DBMSs, but so far none have shipped. Although SQL Server currently supports the C programming language, users will have to wait until this summer for Cobol, perhaps longer for other languages. In addition to the Excel spreadsheet as an interface, Microsoft is working on Omega, its graphical database that uses Basic as its development language. The product is in early testing at some sites but will likely not reach the market until late this year, sources reported. Eventually, graphical front-end tools from Sybase will be ported from workstations and minicomputers to SQL Server, Kaplan noted.
Waiting for Cobol
Travelers Insurance Co. 
has nearly completed a thorough SQL Server evaluation and has ported a production database from another PC-based local-area network for testing, said Mike Molyn, manager of distributed processing for Travelers' database administration group. Although SQL Server provides impressive performance gains, Travelers will hold off on production implementations until Cobol support is available. Molyn said it took only two hours to move the data to SQL Server. What could prevent widescale implementation of SQL Server at Travelers is IBM. If IBM enhances its OS/2 Data Manager into a robust and effective client/server system, ``we will use that instead,'' Molyn said. IBM is expected to provide this enhancement by the summer. The big advantage of an IBM offering is the eventual evolution of a distributed database system that would span IBM 3090 mainframes, Application System/400 minicomputers and Personal System/2 micros. ``That will never be supplied by Microsoft,'' Molyn noted. A diversified East Coast Fortune 100 firm is hard at work rewriting a tracking system to replace a similar application running under IBM's MVS. Unlike Travelers and System One, which have ported databases, this firm is building its application from scratch, using the unreleased SQL Server and Omega, the unreleased and unannounced front-end tool from Microsoft. The firm has big plans for SQL Server and is anxiously awaiting new versions of products such as Paradox to serve as a front end to the server. According to sources, Ralston Purina Co. is considering using SQL Server to help automate its manufacturing operations, and Covia Corp., a division of United Air Lines, is reportedly considering SQL Server to automate reservation systems. By Douglas Barney, CW staff <<<>>> Title : Solbourne Sun-4-compatibl Author : CW Staff Source : CW Comm FileName: sol Date : Jan 23, 1989 Text: LONGMONT, Colo. _ Japanese-backed startup Solbourne Computer, Inc. 
will announce today a multiprocessing superworkstation that the firm claims is fully software-compatible with Sun Microsystems, Inc.'s Sun-4 workstation series. Solbourne uses Sun's Scalable Processor Architecture (Sparc). The company's goal is to establish the Sparc-based Sun-4 as the de facto standard workstation platform while offering current and prospective Sun users an alternative source, said founder and Chief Executive Officer Douglas McGregor. Analysts and early users last week said that the fledgling company could do just that. ``We're pretty bullish on Solbourne for several reasons,'' said David Burdick, an analyst at San Jose, Calif.-based market research firm Dataquest, Inc.
Technology touted
One reason, he said, is the technology itself. The machines include the eight-model, one- to four-processor Sun-4/600, which offers between 9.5 and 30 million instructions per second (MIPS) and is capable of 1.6 million to 4.7 million floating-point operations per second. By way of comparison, a two-processor Sun-4/602 equipped with 16M bytes of memory, a 327M-byte disk and a 150M-byte cartridge tape yields up to 17 MIPS for $51,400 _ bettering the performance of a similarly configured Sun-4/260 by 70% at a 14% price break, according to a Solbourne spokeswoman. ``We're very happy with Sun, but we'd like to stretch our limited resources as far as possible,'' said Mitchell Levin, director of the Computer Science Laboratory at Rensselaer Polytechnic Institute in Troy, N.Y. A ``fairly heavy Sun user'' and Solbourne beta-test site, RPI jumped at the chance to try an alternative with a price/performance advantage. The Solbourne workstation, used since October on a computer stereo vision application, ``has essentially done everything Solbourne promised,'' Levin said. 
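The spokeswoman's comparison can be cross-checked by working backward: a machine that is 70% faster at 14% less money roughly doubles price/performance. A quick sketch follows; the implied Sun-4/260 figures are derived from the quoted percentages, not from published specifications:

```python
# Working backward from the quoted comparison: a two-processor
# Sun-4/602 at 17 MIPS and $51,400 is said to beat a similarly
# configured Sun-4/260 by 70% in performance at a 14% lower price.
# The implied 4/260 figures below are inferences, not published specs.
sun4_602_mips, sun4_602_price = 17.0, 51_400

implied_260_mips = sun4_602_mips / 1.70          # about 10 MIPS
implied_260_price = sun4_602_price / (1 - 0.14)  # about $59,800

ratio_602 = sun4_602_mips / sun4_602_price   # MIPS per dollar
ratio_260 = implied_260_mips / implied_260_price

# Price/performance improvement: 1.70 / 0.86, just under 2x
print(round(ratio_602 / ratio_260, 2))  # 1.98
```

In other words, the quoted 70%-faster and 14%-cheaper figures together amount to a near-doubling of MIPS per dollar over the implied Sun-4/260 configuration.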
Japanese electronics giant Matsushita Electric Industrial Co., which owns 52% of the company and is manufacturing the Sun-4, funded Solbourne's research and development effort to the tune of $11.75 million and is providing a $38.9 million round of startup funding. By Nell Margolis, CW staff <<<>>> Title : DEC products received wit Author : CW Staff Source : CW Comm FileName: decreact Date : Jan 23, 1989 Text: MAYNARD, Mass. _ If it were the Super Bowl, the announcer might say that in last week's desktop product blitz, Digital Equipment Corp. President and quarterback Ken Olsen uncorked a long game-winning pass into an end zone filled with waiting receivers. Trouble is, many of DEC's customers are still deciding whether they will _ or even want to _ catch the toss. Users contacted last week by Computerworld expressed a mixture of reserved optimism and lingering doubt as they crossed their arms and fingers and waited to see whether DEC could really do what it has boldly promised _ deliver a quick and total desktop package. More than a dozen hardware and software announcements constituted the largest introduction in DEC's 31-year history, including machines ranging from personal computers to a workstation capable of processing 14 million instructions per second. Die-hard DEC devotees said they thought the broad rollout was just what they have been waiting for. ``I like DEC, and I've been taking a lot of lip because Sun and Apollo are doing a better job in the workstation market,'' said an impressed Steve St. Ong, systems manager at Square D Co., an electrical equipment maker in Milwaukee. St. 
Ong said that his firm is interested in connecting the new DEC Vaxstation 3100 to its VAX-11/780 because ``we've got developers who like that machine a lot, and it's going to be tough to sell them on anything but VAX and VMS.'' Other DEC supporters chose to keep their cards a little closer to the vest, especially those with lingering memories of DEC's earlier ill-fated desktop products. ``We've been looking forward to it, but we've got to be cautious,'' said Peter Link, manager of engineering systems at Marathon Electric Manufacturing Corp. in Wausau, Wis., who was evaluating the high-end Decstation 3100. ``[DEC's past desktop failures] will make us think twice about buying. We're not going to jump in and say `They're here at last' without doing some careful thinking.'' DEC's past desktop failures have also made even the most devoted fans edgy. ``I'm a big DEC fan _ I've always had DEC equipment and probably always will _ but I would trust them more with medium- and large-size mainframes than what they would come out with for the desktop,'' said Jack Chambers, director of communications and information technology at Duquesne University in Pittsburgh. ``Even some of their own salesmen wouldn't recommend the Vaxmate to us.'' It is these devotees that DEC's marketing department needs to pursue most vigorously if it is going to attain the company's goal of becoming the No. 1 workstation vendor in the world. ``Our goal is to win the desk top,'' Vice-President of Low-End Systems Dom Lacava said. And they may have already made leaps in that direction, claiming at least 5,000 first-day orders for the new products. DEC may achieve at least part of its goal by simply maintaining accounts that have been hemorrhaging over to vendors such as Sun Microsystems, Inc., Apollo Computer, Inc. and Hewlett-Packard Co. 
``DEC's customers could put up with its lousy [workstation] performance for only so long; then they looked elsewhere,'' said Steve Blank, head of marketing at Ardent Computer Corp. and cofounder of Mips Computer Systems, Inc., which makes the R2000 reduced instruction set computing microprocessor used in the new Decstation 3100. ``I think this could stop that sort of attrition.'' Still, DEC may face an uphill battle among even its loyal users. ``We're not overwhelmed, because there is nothing magic in the announcement,'' said Clark Lambert, director of data processing at the Kansas City Star in Kansas City, Mo., where a Vaxcluster supports more than 100 personal computers. By James Daly, CW staff <<<>>> Title : Stockade Author : Clinton Wilder Source : CW Comm FileName: 116stock Date : Jan 23, 1989 Text: Upstate New York is known for its tough winters, but Syracuse, N.Y.-based computer leasing firm Continental Information Systems Corp. (CIS) was battered by much more than the cold and snow last week. CIS stock, already trading at depressed levels from earlier company woes, lost more than 50% of its value in four days, plunging from 2 points to Thursday's close of 1 point, a new low for the year. Precipitating the drop was the company's revelation that it was late on an interest payment to a creditor and is seeking the sale of all or part of the company (see story page 12). CIS traded as high as 9 during the past year. Wall Street takeover rumors boosted the stock of financially stumbling Digital Communications Associates, Inc. DCA added 2 points during the week to close Thursday at 22 . The market reacted coolly to Digital Equipment Corp.'s splashy workstation announcements. DEC lost of a point on announcement day but ended Thursday with a net gain of of a point to close at 100 . IBM, in a strong week for the Dow Jones industrial average, rose 1 points to 123 . CLINTON WILDER <<<>>> Title : A Super Bowl tie? 
Author : CW Staff Source : CW Comm FileName: goode Date : Jan 23, 1989 Text: I've known Bud Goode for eight years. We've collaborated on a lot of Super Bowl stories. And let me tell you, Dick Butkus is no Bud Goode. Nor, for that matter, is Brent Musburger, Irv Cross or any of them. Bud stands alone among pigskin prognosticators. After all, he and his computer have picked the past two Super Bowl winners. And thus far, they are six out of eight in this year's National Football League playoffs. Even if they were last among this year's regular stable of 10 New York Post forecasters, they finished first in 1987. So this Bud's for me. Having said that, let me tell you that he threw me a curve when I called him for the lowdown on Super Bowl XXIII. Instead of boldly hemming and hawing and saying how you just can't count on the computer to pick the winner of a football game, he came right out with his forecast, naysayers be damned. A tie. Imagine that. A tie. No winner. Too close to call. I felt like telling Bud that the imperious, cigar-smoking editors at Computerworld would throw me off the sports beat if I came to them with a story calling for a tie in the Super Bowl. But I didn't. Instead, I said, ``OK, Bud, then what's your gut feeling?'' Before he answered, I heard the sound of rustling readouts. Or maybe it was the sports page. The latest line picks San Francisco by seven points. Finally, he spoke: ``I'm gonna take Cincinnati and the seven.'' There is, of course, a computer-based method to his madness. Each week during football season, he massages more than 170 variables per page three different ways as part of a 60-page readout he currently sells to the head coaches of nine NFL teams. And he loves the graphics he's able to produce with his new Hewlett-Packard laser printer. Bucking the trend in linebackers of late, Bud's been downsizing. His Franklin AT is the latest step in a genealogy of hardware that's seen him go through a Sperry 1100 and an IBM 4381. 
Offensive balance
Turning to the 49ers-Bengals matchup, he talked about the statistical strength of both teams' offenses. Bud dedicates 50 variables to the clash of offense and defense. ``On the first 16 statistics, San Francisco has an advantage over Cincinnati's defense,'' he reported. ``And on the first 17 statistics, Cincinnati's offense has the advantage over San Francisco's defense.'' These statistics, amassed during the regular season and playoffs, include passing efficiency, fumbles allowed and yards per pass attempt. When Bud weighed out the whole mix with his PC, the outcome was so close it was a wash statistically. And he is loath to contradict it. ``I never want to go against the computer, because when I do, I'm wrong nine times out of 10,'' he noted. If the game is tied and time is waning midway through the fourth quarter, what does he predict? If the Bengals have the ball, they will try to eat up the clock by running fullback Ickey Woods behind their leviathan offensive line. If the 49ers have it, they will run and simulate the run with short, safe ``dump-off'' passes to running backs such as Roger Craig. Bud does not like to think his game has fallen off. And he has little respect for some of his emerging, computer-based forecasting competition. ``The last one I saw was in USA Today, and I burst out laughing,'' he said. ``I don't think they know what they're doing.'' If that's the way Bud feels, I agree. By Bruce Hoard, Special to CW <<<>>> Title : MIS shies away from secon Author : CW Staff Source : CW Comm FileName: pctrend Date : Jan 23, 1989 Text: The MIS executive's short personal computer list is getting shorter. Spurred by more sophisticated PC applications and the desire for control over their organization's PC proliferation, corporate information systems departments are narrowing their approved PC vendor choices. 
And the trend, which is expected to accelerate in 1989, will mean success on the corporate desk top for top-tier vendors IBM, Compaq Computer Corp. and Apple Computer, Inc. and tough times for lower priced compatible vendors such as Wyse Technology, Tandon Corp. and AST Research, Inc. The trend, MIS directors say, is spurred by the emergence of more powerful PCs as the platform for critical, make-or-break business applications once entrusted to minicomputers. ``If I'm using a PC as the central communications controller for a warehouse management system, I'm going to be very careful about where that machine comes from,'' said Richard Lester, vice-president of corporate development at Associated Grocers, Inc. in Seattle. Portland, Ore., steel maker American Industries, Inc. says it is happy with its turnkey Wyse system for computer-aided design and manufacturing applications. But for future purchases, MIS director Larry Potter's edict is IBM Personal System/2s or Compaq compatibles. ``I don't have a complaint in the world about Wyse,'' Potter said. ``They're just not one of the major players that I think about.'' Chris Kryzan, product marketing manager for PCs at San Jose, Calif.-based Wyse, responded that in large corporate accounts, there is room for the Big Three of IBM, Apple and Compaq, plus ``a fourth compatible, low-cost alternative. . . . Our strategy is to become that fourth brand.'' Financial analysts expect to see the trend reflected in computer industry fourth-quarter results to be announced this week. Apple and Compaq are expected to continue their stellar growth rates while AST and Wyse have already announced expected losses from battered profit margins and, for Wyse, declining revenue. ``The industry polarization is getting stronger,'' said Thomas Galvin, an analyst at Smith Barney, Harris Upham & Co. ``As users get more and more into the OS/2 and multitasking world, reliability and quality will be stretched to the limit. 
People feel more comfortable with the top-tier products.'' Comfort is tops At IBM 4381 shop Pacific Standard Life Insurance Co. in Davis, Calif., OS/2 has made Jerry Waters, vice-president and director of IS, into an IBM hardware maven. He says the key selling point for standardizing on the PS/2 is not technology or price, but comfort. ``It may not be the best way, but it's the easiest way to address our long-range strategy to migrate to OS/2 for distributed processing,'' he said. Waters noted that his shop is not all-IBM, with peripherals from Memorex Telex and Xerox Corp. Waters also exemplifies another trend toward shorter short lists _ the desire to simplify one aspect of an increasingly complicated job. ``I don't have enough people for someone to go through magazines'' looking for PC vendors, he said. ``I just want one that will do the job and still be in business in five years.'' Confusion about operating system software is also playing into the hands of the largest PC vendors. ``Users are trying to sort out Unix, OS/2, OS/2 Extended and DOS 4.0,'' said industry consultant Richard Shaffer, president of New York-based Technologic Partners. ``The more confusion there is in software, the more buyers try to deal with it by reducing the confusion in hardware.'' Bechtel Group, Inc. in San Francisco, with some 2,500 PCs corporatewide, is currently in the process of narrowing its PC vendor list. ``As you try to network all these systems together, you want to cut back on the number of variables, and one of those is the number of different suppliers,'' said H. William Howard, manager of information technology. By Clinton Wilder, CW staff <<<>>> Title : Correction Author : CW Staff Source : CW Comm FileName: correct1 Date : Jan 23, 1989 Text: Supercalc, a mainframe spreadsheet sold by Computer Associates International, Inc., was incorrectly termed ``the leading mainframe spreadsheet'' [CW, Dec. 5]. It is one of several products in that market. 
<<<>>> Title : Computer firms sign up fo Author : CW Staff Source : CW Comm FileName: tv Date : Jan 23, 1989 Text: SANTA CLARA, Calif. _ IBM, Digital Equipment Corp., Apple Computer, Inc. and Hewlett-Packard Co. are among 16 leading U.S. firms that gathered in Las Vegas last week and decided to bet on the U.S. in the coming wave of television technology. The companies are pitching in to draw up a business plan for a for-profit U.S. venture whose mission will be to garner ``a majority of the hardware markets associated with an anticipated U.S. high-definition television business,'' according to a statement issued by the American Electronics Association (AEA). High-definition television (HDTV), a next-generation technology that promises viewers clearer pictures and promises vendors bigger pictures on their balance sheets, is already under development by government-funded consortiums in Europe and Japan; the U.S., however, has been slow to enter the vaunted new market. The HDTV cause has become critical for both the U.S. military and electronics industries, which argue that the technology needed to develop HDTV is crucial to the further development of semiconductor and other electronic technologies. According to the AEA announcement, the limited partnership formed will aid the U.S. government in funding, developing and managing at least one HDTV consortium, licensing the resultant technology to other domestic companies, and directing technology to HDTV and related product manufacturing. By Nell Margolis, CW staff <<<>>> Title : NAS deal brings new lineu Author : CW Staff Source : CW Comm FileName: 1nasnew3 Date : Jan 23, 1989 Text: SANTA CLARA, Calif. _ Months of uncertainty for users of National Advanced Systems CPUs came to an end last week with National Semiconductor Corp.'s sale of 50% of its mainframe unit to Memorex Telex N.V. 
After dozens of rumors about the potential sale, the deal _ a total package estimated at between $300 million and $350 million _ appears to guarantee continuity for customers, at least in the near term. NAS management will stay in place, and no work force reductions are anticipated, a NAS spokeswoman said. But with Memorex Telex obtaining an option to buy out all of National Semiconductor's stake at an unspecified future date, further changes could occur. ``National obviously is headed in a different direction than they were before, but we're encouraged that they kept half the company,'' said Richard Lester, vice-president of corporate development at Seattle-based Associated Grocers, Inc., which runs a NAS AS/XL 60 and 90/60. ``We've done business with Memorex [for several years], and we're very comfortable with them.'' ``If there were any change, we probably wouldn't see it for six months or more,'' agreed Gene Robbins, assistant vice-president of academic administrative services at Queens College in New York. ``Only time will tell if our relationship with NAS will be status quo or turn negative later on.'' The college has a 3-year-old NAS 8023 mainframe and 16 NAS disk drives. It has used NAS machines since 1982. The partial marriage matches hard-charging, acquisition-minded peripherals veteran Memorex Telex with an IBM plug-compatible mainframe seller whose fortunes stagnated last year. NAS' performance was a major drag on National Semi's bottom line at times during the past year, and it was clear to users and analysts that the chip-making parent was seeking both an exit from the systems industry and much-needed cash. Yet that same uneven financial performance often made NAS the star performer when National Semi's other operations were unprofitable. NAS' performance slipped badly during the six months ended August 1988, but the second fiscal quarter ended Nov. 30 brought record bookings for CPU and disk-drive orders. NAS disk drive customer Michael S. 
Heschel, corporate vice-president of the Information Resources Division at Baxter Health Care Corp. in Deerfield, Ill., agreed that any change would come later on. ``We'll see what happens over the next six to 12 months,'' he said. ``I don't think even the two parties involved know exactly how it'll come out.'' Pass the hat Under terms of last week's agreement, National Semi will receive $250 million in cash and four million shares of Memorex Telex common stock, equal to 10% of the privately held company. The cash will not come directly from Memorex Telex but from a consortium of financial institutions raising the $250 million by pledging the assets of NAS, according to a NAS spokeswoman. ``This is just a transitional step to the merge yet to come,'' said Bob Djurdjevic, president of Annex Research in Phoenix. ``Ultimately, Memorex Telex will probably exercise their purchase option and buy the rest.'' If that happens, Memorex Telex will surpass Amdahl Corp. as the leader in total sales of IBM plug-compatible systems and peripherals, with annual revenue of about $3 billion. ``We want to have a major role in the full spectrum of products, either directly or indirectly,'' said George L. Bragg, executive vice-president of strategic planning at Memorex Telex. ``This arrangement will allow each company within the $3 billion entity to specialize in its own product area: NAS will provide the full range of CPUs and peripheral products, while Memorex Telex will provide complementary peripherals and communications products.'' There are interesting potential synergies regarding Japan's Hitachi Ltd., supplier of both NAS and Memorex Corp. products. Hitachi has supplied Memorex with solid-state drives for resale under the Memorex label. NAS sells the same solid-state drive as the NAS 7900 Semiconductor Disk. 
``If [NAS is] joining forces with Memorex, the question would be whether they would begin to move away from the Hitachi product line and toward the Memorex product line,'' Baxter Health Care's Heschel said. Hitachi's philosophy ``has been to leave their options open and to let the product flow through multiple distribution channels,'' Djurdjevic said. In Europe, for example, there are three Hitachi CPU distributors _ Comparex, Ing C. Olivetti & Co. and NAS Europe. A Hitachi spokesman declined specific comment on the NAS-Memorex Telex deal. Memorex Telex's corporate headquarters in Tulsa, Okla., is within 100 miles of an 18-month-old Norman, Okla., Hitachi factory that produces disk drives for NAS. The 73,000-square-foot Hitachi factory is doubling its capacity this year [CW, Jan. 9]. By Jean S. Bozman and Clinton Wilder, CW staff <<<>>> Title : Muscular 68030 Mac SE to Author : CW Staff Source : CW Comm FileName: mac1 Date : Jan 23, 1989 Text: CUPERTINO, Calif. _ Apple Computer, Inc. will kick off what is expected to be a busy year for system introductions next week by coming out with a more powerful version of the Macintosh SE, its mid-range Macintosh system. The Mac SE/30, slated for a Macworld Expo debut, will be powered by Motorola, Inc.'s 68030 chip, according to sources. It will be offered in two models _ one with 2M bytes of random-access memory and a 40M-byte hard disk drive and a second with 4M bytes of RAM and an 80M-byte hard drive. Pricing is expected to be in the $5,000 range for the 40M-byte hard drive version and in the $6,000 range for the 80M-byte hard drive model. To the likely disappointment of Apple customers, Macworld will not see the introduction of the long-awaited Macintosh laptop, sources said. Apple Chairman and Chief Executive Officer John Sculley has promised a Mac laptop debut in 1989. The introduction of a Mac SE fulfills other promises made by Apple last fall when it introduced the 68030-based Macintosh IIX. 
At that time, Randy Battat, vice-president of product marketing, said the Mac SE and the Mac II would become the foundation of two distinct product lines. The Mac IIX, which debuted in September, was the first new Macintosh since the original Mac II and Mac SE were unveiled in spring 1987. Standard drive Battat also said that Super Drive, a 1.44M-byte flexible drive that can read and write Microsoft Corp. MS-DOS and OS/2 files, would become a standard Mac peripheral. Super Drive requires 4M bytes of RAM. It will be offered with the Mac SE/30. The Mac SE/30 retains the traditional Mac look, with the smaller screen incorporated into the base of the system. It offers a single expansion slot. The Mac IIX sports a larger, stand-alone screen and six expansion slots like its predecessor, the Mac II. By Julie Pitta, CW staff <<<>>> Title : HP plots Unix desktop por Author : CW Staff Source : CW Comm FileName: lmx2a Date : Jan 23, 1989 Text: Hewlett-Packard Co. will detail its strategy next week for integrating Unix minicomputers and workstations into OS/2 networks, according to sources close to HP. Using HP's Unix System V port of Microsoft Corp.'s OS/2 LAN Manager, users will be able to attach a Unix server to Microsoft OS/2- or MS-DOS-based workstations, the sources said. Those desktop users will be able to access Unix-based resources. The arrangement also provides Unix shops with more freedom to choose the workstation best suited to their needs and budgets. In addition, sources close to HP said the mini maker plans to beef up its work group and Starlan 10 product lines. HP representatives declined to confirm or deny the reports. The HP LAN Manager OS/2 server software is an adaptation of LAN Manager/Xenix (LM/X), a generic Unix port that HP is co-developing with Microsoft. It is expected to ship this summer. ``HP has recognized that if you can't own the desk top, you can at least attempt to control it via these types of network products,'' a source close to HP said. 
The mini maker's approach here is to attempt to create user dependency with its network access to multiple standards; it will position its Unix LAN Manager server software under the Advancenet umbrella. On the drawing board HP is expected to announce the following specifics: LAN Manager connectivity to the HP 3000 and 9000 minicomputers, which would allow these machines to function as OS/2 servers. The ability to integrate MS-DOS, OS/2 Standard Edition 1.1, Transmission Control Protocol/Internet Protocol (TCP/IP) and HP-UX, which is a superset of System V. Support for OS/2 LAN Manager's Named Pipes and Mail Slots application programming interfaces, IBM's Netbios and LU6.2, as well as Berkeley Sockets, software utilities based on the University of California at Berkeley's Unix. Services will include Arpa (which gets users into Digital Equipment Corp.'s Decnet, for example) and Berkeley Services, TCP/IP's Telnet and Mail, and IBM's Distributed Office Support System and Professional Office System. Despite being a charter member of the Extended Industry Standard Architecture group, HP will unveil both an IBM Micro Channel Architecture-compatible card and an IBM Personal Computer AT bus card for its Starlan 10 line. First announced by Microsoft last February, LM/X reportedly will enable a Unix System V-based server to service requests from workstations running under Microsoft Networks (MS-Net), MS-Net for Xenix and LAN Manager network software. The generic LM/X port runs on Intel Corp. 80386-based servers and under Microsoft's Unix System V/386, Release 3.2 operating system. However, it can be ported to other versions of Unix and other chips such as Motorola, Inc.'s 68000 family, a popular Unix platform, said Paul Sribhibhadh, Microsoft's director of Xenix marketing. By Patricia Keefe, CW staff <<<>>> Title : CIS lender woes could for Author : CW Staff Source : CW Comm FileName: cis Date : Jan 23, 1989 Text: SYRACUSE, N.Y. 
_ Dogged by liquidity woes, Continental Information Systems Corp. (CIS) is negotiating the sale of part or all of its business, a company official confirmed last week. Analysts who followed the company through its climb to No. 2 status in the computer leasing industry said that CIS' problems, which one analyst called ``a tragedy,'' were particular to the company and not indicative of trouble in the industry at large. ``Actually, the industry is doing well,'' said L. Crandall Hays, an analyst at Robert W. Baird & Co. in Milwaukee. In fact, that potential is there for CIS as well, according to Theodore Levy, who follows the company for Rochester, N.Y.-based analyst Sage, Rutty & Co. However, neither CIS' $1.5 billion lease portfolio nor what Hays referred to as ``a huge cash flow'' may prove sufficient to stave off creditors that have grown skittish in the face of CIS' apparent inability to turn around problems stemming from its 1987 acquisition of arch-competitor CMI Corp. ``CMI had an admirable record _ its earnings went straight up from 1984 to 1987,'' Levy said. ``But even then, they were highly leveraged.'' Costs associated with the assimilation of CMI figured largely in CIS' reporting a multimillion-dollar first-quarter loss last spring. The company's stock fell, and its creditors panicked, Levy said. ``If you're a leasing company and your creditors back away,'' he noted, ``you're in bad trouble.'' CIS' trouble may have come to a head last week, when the company defaulted on $3.8 million worth of interest due to The Prudential Insurance Company of America on a $110 million loan. CIS, a spokesman said, had filed for a waiver of the payment before the 10-day grace period expired last Tuesday. 
The spokesman declined to give details of the negotiations except to confirm that CIS is contemplating a whole or partial sale, that the possibility of an equity investment by an unnamed investor or investor group is also under discussion and that CIS does not believe that filing for protection under Chapter 11 of the U.S. Bankruptcy Code is the route to take. By Nell Margolis, CW staff <<<>>> Title : OSF details user interfac Author : CW Staff Source : CW Comm FileName: motif2 Date : Jan 23, 1989 Text: The Unix tables have turned once more. The Open Software Foundation (OSF) has apparently gained renewed credibility with its selection of a user interface, leaving AT&T supporting an interface that differs from what is likely to be the standard look and feel. The reaction among Unix vendors and observers was generally positive last week, as the OSF explained its rationale for choosing its graphical user environment, called OSF/Motif. The OSF said it chose the Presentation Manager appearance offered by the joint Hewlett-Packard Co./Microsoft Corp. interface to facilitate the migration of personal computer users to Unix systems. Changing their tune Analysts, once skeptical of the OSF, applauded the action. Amy Wohl, president of Bala Cynwyd, Pa.-based Wohl Associates, said the OSF was successful in accomplishing a technically and politically difficult task in a short amount of time. ``I didn't believe they could do it, but they managed to pull it off,'' she said. The OSF's decision to adopt the Presentation Manager look and feel is expected by some analysts to establish that screen behavior as the dominant one in the industry. Tom Kucharvy, president of research firm Summit Strategies, said it was a foregone conclusion that Presentation Manager will be the standard user interface look, excluding the Apple Computer, Inc. Macintosh market. 
Both DEC and HP have indicated that their separate user interfaces, Decwindows and New Wave, will evolve to comply with OSF/Motif, and some observers ventured that AT&T would be forced to do the same. Indeed, an AT&T spokesman stated that Open Look may not be bundled with the next release of Unix System V, although developers' tool kits will be available sometime this quarter. OSF/Motif reportedly will be available midyear and will be offered separately as well as bundled with OSF/1 when that becomes available. The OSF said it expects the interface to be ported to other environments such as Unix System V and proprietary operating systems. The OSF also revealed what are considered to be generous licensing terms to make the product appealing to software developers and computer manufacturers. In addition to a $1,000 charge for a source license, computer vendors are subject to a $40 binary license per unit that Motif is used with, and software vendors likewise must pay a $10 runtime license fee for each software application Motif is bundled with. There are volume discounts applied to both the runtime and binary license fees, and the costs will most likely be passed on to the end customer by vendors. By Amy Cortese, CW staff <<<>>> Title : Net management star of Co Author : CW Staff Source : CW Comm FileName: comnet Date : Jan 23, 1989 Text: WASHINGTON, D.C. _ Network management will once again hog the limelight with high-speed host networks as the other major attraction at this year's Communications Network conference. Among the Comnet show's network management high spots will be Cincom Systems, Inc.'s introduction of Expert System Foundation, development tools for automating the management of IBM Systems Network Architecture networks through Cincom's Net/Master. The offering was designed to help network managers lacking programming skills to implement their knowledge in automated network management procedures, said the firm's senior product manager, Vicky Duckworth. 
A Group 40 IBM MVS-based version of the product is priced at $10,000. Cincom also announced Net/Stat, a rule-based system driven by the changing status of network devices rather than by events, Duckworth said. Other announcements at the Feb. 5-8 show should include the following: AT&T is expected to roll out major components of its Unified Network Management Architecture, which reportedly will provide integrated monitoring, troubleshooting and diagnostics across a wide variety of AT&T voice and data networking equipment and services. According to one industry source, the announcement, which will take place either at Comnet or the week before, will include a link to IBM's Netview and to Cincom's Net/Master. Start-up company Clear Communications Corp. in Deerfield, Ill., is expected to announce Clearview, a network management system that monitors carrier lines and generates performance reports based on a variety of parameters, company spokesman Robert Copithorne said. The system, which can be interfaced with T1 equipment vendors' network management systems, arms managers with the information they need to anticipate network failures, correct carrier billing and compare competing services, Copithorne said. Two products are expected to vie for the title of first channel-attached mainframe network based on the Fiber Distributed Data Interface (FDDI) standard. Fibronics International, Inc. in Hyannis, Mass., will announce its FX8222 IBM-to-FDDI Controller. The product will link the channels of IBM and IBM-compatible mainframes over an FDDI local-area network, a company spokesman confirmed. In data streaming mode, the link will handle speeds of up to 24M bit/sec., the company said. Integrated Networks Corp. in San Diego will announce Fibertalk 3000 Channel Interface Unit, an FDDI-based network that can support full host channel speeds of 4.5M byte/sec., or approximately 35M bit/sec., over distances of up to 1.2 miles, a company spokesman said. 
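The channel-network announcements above mix M byte/sec. and M bit/sec. figures. A minimal arithmetic sketch (plain 8-bits-per-byte conversion, ignoring framing overhead, which is presumably why a computed 36M bit/sec. gets quoted as "approximately 35M bit/sec."):

```python
# Cross-check the byte/sec. vs. bit/sec. throughput figures quoted for
# the channel-attached network products in this roundup. Illustrative
# arithmetic only: real links lose some capacity to framing overhead.

def mbyte_to_mbit(mbyte_per_sec: float) -> float:
    """Convert M byte/sec. to M bit/sec. (8 bits per byte)."""
    return mbyte_per_sec * 8

# Integrated Networks' Fibertalk 3000: 4.5M byte/sec. full channel speed
fibertalk_mbit = mbyte_to_mbit(4.5)    # 36.0, quoted as ~35M bit/sec.

# Data Switch's Channelplexer: up to 3M byte/sec. over a 45M bit/sec. T3
channelplexer_mbit = mbyte_to_mbit(3.0)  # 24.0M bit/sec. of 45 available
t3_utilization = channelplexer_mbit / 45  # roughly 53% of the T3 circuit
```

The same conversion explains why a single 24M bit/sec. FDDI data-streaming link, as in the Fibronics product, sits just at the Channelplexer's 3M byte/sec. payload rate.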
The units are said to support IBM mainframes and compatibles as well as IBM cluster controllers. A third channel-based networking product introduction is expected from Data Switch Corp. in Shelton, Conn. The Channelnet Model 9455 Channelplexer is said to extend up to four IBM or compatible host channels over a 45M bit/sec. T3 circuit at speeds up to 3M byte/ sec. The product is said to save money for users by supporting up to four links between host channels and high-speed peripherals over the same T3 link. By Elisabeth Horwitt, CW staff <<<>>> Title : DEC pins hopes on connect Author : CW Staff Source : CW Comm FileName: decredux Date : Jan 23, 1989 Text: Can the No. 1 desktop vendor be the one that offers mainly connectivity _ not hardware or software? Digital Equipment Corp. believes the answer is yes, and it said last week that in a year or so, that strategy, anchored by Decwindows, will carry the company to the top of the desktop heap. DEC announced that Decwindows, already shipping in DEC's Ultrix 32 Version 3.0, will be included in its Release 5.1 of VMS, due to ship in February. In preaching the connectivity gospel, DEC President Kenneth Olsen reiterated a position he has expressed numerous times previously: that DEC would not innovate standard desktop hardware or software but would tie disparate desktop devices together. Such a strategy is far from new. In 1986, the company gave birth to the Vaxmate, a Microsoft Corp. MS-DOS system that had built-in communications capabilities. That system failed to capture the market's imagination, however. But if connectivity alone will propel DEC, much of the required software is not there yet and will not be announced until later this year at the earliest. Indeed, much of what was described last week remains only conceptual for now. For example, ``live link,'' a capability of DEC's Compound Document Architecture (CDA), can update a document in one user's application from a document in another user's application. 
But two software packages offering the capability, Decwrite and Decdecision, will not be available until May. The ultimate goal of CDA is to offer instant updating of a document from information in remote databases and spreadsheets, but no databases or spreadsheets now comply with CDA specifications, according to DEC officials, who gave no date for when the products will. However, 15 independent software vendors have endorsed CDA and are said to be developing CDA-compliant software. Another missing building block is MS-DOS-based Decwindows, which is not offered on a stand-alone personal computer. Instead, it will only be offered on a server for PCs, effectively turning the PC into a dumb terminal. A user would have to exit the MS-DOS application to operate in Decwindows from the server. MS-DOS Decwindows Display Facility, which would allow this, has no price or shipping date. ``I've got problems with MS-DOS not being on a PC. A user with a lot of PCs on a LAN may not want a server,'' said Frank Dzubeck, president of Network Communications Architects, Inc. in Washington, D.C. However, Richard Treadway, Decwindows program manager at DEC, said the company considered bringing Decwindows to MS-DOS PCs but did not because of the large amounts of added memory that would be required. In addition, the X-library of applications would be difficult to implement in MS-DOS. ``We didn't think there would be enough demand,'' Treadway added. Under VMS, the recommended minimum amount of memory in a Vaxstation to run Decwindows and several applications is 4M bytes. Under Ultrix, the recommended minimum is 6M bytes. In a VMS-based server, the recommended amount is 8M bytes, according to DEC. By Stanley Gibson, CW staff <<<>>> Title : Inside lines Author : CW Staff Source : CW Comm FileName: liner116 Date : Jan 23, 1989 Text: The low end without Lowe. 
IBM will hold a product blitz this spring for its low-end systems, its first major announcement after the departure of former low-end systems president William Lowe, according to sources. The PS/2 line is to be enhanced at almost every price point, and a PS/2 incorporating the Intel 386SX microprocessor and AT bus should be introduced as well. IBM will make available bus mastering cards that were demonstrated at Comdex Fall '88 and is also expected to announce third-party bus mastering cards. The Armonkee is also expected to announce applications developed by third-party vendors for PCs running OS/2 to be sold under the IBM label. Listen to what we say, not what we say. A Radio Shack spokeswoman apparently misspoke when she was quoted here last week saying that DEC's OEM versions of Tandy's PCs are ``Digital's in name only.'' Ed Juge, director of marketing for Tandy, last week said the comment was unfortunate and inaccurate. As with any OEM customer, DEC is free to ``add or subtract features,'' and it would be surprising if the company merely relabeled the Tandy boxes, he said. What is IS? The desire to polish the image and expand the role of the computer management profession meant changing the department name from data processing to management information systems and lately to just ``information services.'' Could that lead to confusion? Well, if you spotted the newspaper help-wanted ad for a ``Director of Information Services'' at Merck & Co., it may have. According to the fine print, IS in this case is actually (shudder!) public relations; the top computer executive at Merck is the vice-president for ``computer resources.'' No show, no refund! One notable no-show at Comnet will be DEC, which canceled a 2,000-square-foot booth too late to save its $56,000 fee, according to company spokeswoman Pam Lattimer. A company-wide reevaluation of marketing strategy resulted in the decision to focus more on DEC-sponsored shows such as Dectop and Decworld, she added. 
DEC will not be exhibiting at the Interface conference in March, Lattimer said, and the company may also bow out of the Comdex/Spring '89 show, rumor has it. Follow the leader. Don't be surprised if Texas Instruments surfaces this spring with a 4M-bit/16M-bit token-ring dual chip set. A source says third parties have been beating up TI over this issue, given that IBM's chip supports these capabilities. IBM is also expected to come up with a very cheap, plain-vanilla 4M-bit Token-Ring board, according to a source. Slip, slidin' away. In a February 1988 announcement, Microsoft slated its Unix port of OS/2 LAN Manager, LAN Manager/Xenix (LM/X), for an early 1989 release, preceded by a software developer's kit that was to ship by the end of 1988. In an update, Paul Sribhibhadh, Microsoft's director of Xenix marketing, says early LM/X code has shipped but that the developer's kit will be three months late. It will not be available until Uniforum in late March. The kit's release was delayed in order to incorporate TCP/IP transport software, he said. Open and shut. The recently announced graphical user interface by the Open Software Foundation (OSF) left out a key piece that was widely expected to be included, Adobe's Display Postscript imaging model. Apparently, Adobe did not agree to the OSF's licensing terms, and a suitable arrangement could not be worked out in time for the Dec. 30 announcement. However, both parties confirm they are still negotiating. With no snow in the Northeast, Massachusetts high-tech execs are apparently thinking summer. Jim Manzi, chairman of Lotus, according to published reports, paid $4 million for a beachfront home on Cape Cod. Apollo Chairman Thomas Vanderslice is going downscale, reportedly negotiating for a $2 million cottage on the Cape. 
If you've got any reason to believe these guys or any other execs are thinking retirement instead of summer, give the hot line a call at 800-343-6474 or 508-879-0700 and tip off News Editor Pete Bartolik. <<<>>> Title : Telecom and MIS Author : Elisabeth Horwit Source : CW Comm FileName: teltrend Date : Jan 23, 1989 Text: Ever since divestiture, telecommunications has played an increasingly crucial role in the competitive plans of Fortune 1,000 firms and other large users, according to a study by The Eastern Management Group. Of the MIS and communications managers surveyed at 12,000 organizations, 27% said they expected telecommunications personnel and budget outlays to grow between 10% and 50% during the next five years. Some 13%, representing the largest companies, anticipated growth of 50% to 100%, the study found. Telecom executives said they expected their budgets' percentage of overall corporate budget to grow from 2.8% to 3.8%, on average, by 1993. In contrast, data processing/MIS managers predicted growth from 5% to 5.7% for their departments during the same period. A reflection of telecommunications' increasing strategic importance is the fact that in 1988, three-fourths of major companies had placed the function under DP/MIS, the report said. This also points to increased integration of voice and data operations, according to the study. Before divestiture, 42% of the study group had housed telecommunications within administration and human resources. The study also found an accelerating trend of decentralization of telecommunications and information systems operations, tying these departments' operations more directly to individual business units. ELISABETH HORWITT <<<>>> Title : Court: States may tax net Author : CW Staff Source : CW Comm FileName: 1scotus Date : Jan 23, 1989 Text: WASHINGTON, D.C. _ A U.S. Supreme Court ruling last week will allow states to tax interstate voice and data traffic. 
With several states having already enacted such taxes and others desperately seeking new sources of revenue, the decision could raise the cost of business communications and affect the location of data centers. The decision upheld an Illinois law imposing a 5% excise tax on interstate voice and data transmissions that begin or end in the state. Approximately a dozen states have similar taxes, and more states are expected to follow suit. Axed with taxes Illinois collected $142 million in taxes by mid-1987 and continues to collect revenue at a rate of $10 million per month from business and residential users, according to a court filing. Business network managers, especially those with high-volume transaction networks such as airline reservation systems, may wind up moving their data centers to ``tax haven'' states that do not impose telecommunications taxes, according to experts. ``This will definitely affect site-location decisions,'' said Kenneth L. Phillips, vice-president of telecommunications policy at Citicorp in New York and chairman of the Committee of Corporate Telecommunications Users. Some states will jump on the tax bandwagon, Phillips said, while others may choose to become tax havens in order to lure businesses to their states, similar to the way that Brussels is considered a ``data haven'' in Europe. State and city governments have enacted a variety of taxes on the fast-growing business of telecommunications equipment and services, including private lines, according to August H. Blegen, executive director of the Association of Data Communications Users in Bloomington, Minn. ``That is absolutely uncalled for and runs opposite to the direction legislators ought to be going,'' he said. But state governments, facing severe budget pressures, are turning to taxes on high-tech operations to boost revenue. Massachusetts Gov. 
Michael Dukakis, in developing a $604 million tax package announced last week, included such a provision in the wake of the Supreme Court's ruling, according to local reports. The Supreme Court decision, in the case of Goldberg v. Sweet, amounted to an endorsement of the Illinois Telecommunications Excise Tax Act of 1985. It taxes interstate telecommunications that begin or end in the state and are charged to an Illinois address. The law provides a tax credit if the taxpayer can prove that another state has taxed the same call. The high court unanimously rejected arguments that the state tax is an unfair burden on interstate commerce, which is protected by the U.S. Constitution's commerce clause. Tax dancing An Illinois trial court ruled that Illinois was unfairly trying to tax the entire cost of an interstate act, which takes place only partly in Illinois. But the Illinois Supreme Court overruled the lower court, and the U.S. Supreme Court upheld that reversal on appeal last week. The high court downplayed the prospect of multiple taxation and ruled that the Illinois tax is fairly apportioned. ``Its economic effect is like a sales tax, the risk of multiple taxation is low and actual multiple taxation is precluded by the credit provision,'' the opinion said. The court acknowledged that interstate communications is not a local event but said it was reasonable for Illinois to tax calls originating or terminating there because it is virtually impossible to trace or separate each state's role in an interstate call. Because of computerized switching, the court said, ``the path taken by electronic signals is often indirect and typically bears no relation to state boundaries.'' The plaintiffs were two Illinois residents and GTE Sprint Communications Corp., the predecessor of US Sprint Communications Co., which is required to collect the tax because it is a retail provider of communications services. 
By Mitch Betts, CW staff <<<>>> Title : Chip prices drop; PC pric Author : CW Staff Source : CW Comm FileName: 1pchip Date : Jan 23, 1989 Text: While several vendors used memory costs as an excuse to hike personal computer prices last year, no vendor has yet promised to bring those prices down now that the shortage is over. According to memory-card product vendors and industry analysts, dynamic random-access memory chips are now increasingly available, and prices are dropping. Prices of 1M-bit chips are projected to plummet more than 50% by year's end. Already, the volume purchase price of 1M-bit chips has dropped to less than $20 per chip. When the memory crisis peaked in June, the price hovered at $50 to $60. Semiconductor price trackers now project that chip prices will fall below $10 this year. But vendors that raised system prices during the dearth _ including Apple Computer, Inc., AST Research, Inc., Hewlett-Packard Co., Sun Microsystems, Inc. and Digital Equipment Corp. _ have not committed to bringing those prices down again. ``We just can't make that kind of promise,'' a DEC spokesman said. That sentiment was expressed by a half-dozen other systems vendors. With the exception of IBM, which manufactures much of its own 1M-bit memory components, aggregate system price erosion averaging 12% to 15% is in the offing, said John Dunkle, vice-president of Work Group Technologies, a PC market research group based in Exeter, N.H. However, users will have to wait for a ``ripple effect'' to occur, he added. ``I expect the market to reflect the DRAM price drop in the next couple of months,'' said Jonathan Nareff, data center manager at Combustion Engineering, Inc. in Windsor, Conn. PC prices should come down, but not by much, said Fred Stuart, director of corporate information services at Phelps Dodge Corp. in Phoenix. ``There is still too much high demand for high-memory systems like the PS/2,'' he said. 
But because memory is often loaded at the retail level, PC prices can drop quickly, said Jim Ashbrook, senior vice-president of marketing at AST Research in Irvine, Calif. The price of memory installed by the retailer could decrease almost in sync with DRAM price cuts. At the same time, it takes up to 120 days from the time memory is purchased to the time a change is reflected in the base system price, Ashbrook said. Bruce Grant, vice-president of technology at Microage Computer Stores, Inc. in Tempe, Ariz., charged that it will be up to the vendors to show a good-faith price reduction. They ``now have to pay a price for making a show of the memory price increase,'' he said. Memory prices will take a dramatic dive this year for three reasons, said Drew Peck, semiconductor analyst at Donaldson, Lufkin & Jenrette, Inc., a technology research group based in New York. First, 256K-bit memory chip manufacturing lines will have been successfully converted to build 1M-bit chips by the middle of the year, he said. Second, semiconductor manufacturers that have ramped up at breakneck speed will now glut the market, resulting in a price war. Finally, many vendors devastated by the shortage will overstock memory to avert any future crises. By William Brandel, CW staff <<<>>> Title : More from DEC blitz Author : CW Staff Source : CW Comm FileName: decside Date : Jan 23, 1989 Text: DEC also disclosed the following last week: VAX PC, a Microsoft Corp. MS-DOS emulator for VAXs, particularly the Vaxstation 3100, will be available in March 1989 at a single-user charge of $500. A future release of VAX PC will include DEC's VAX/VMS services for MS-DOS. That will allow Vaxstation and PC users to share MS-DOS-based data, applications, disks and printers, said Richard Treadway, Decwindows program manager. An Ultrix version is scheduled to come later. Several components in its Vaximage program, including the Vaximage Scanning Subsystem, a $5,500 scanner for VMS workstations. 
DEC also announced Vaximage Scanning Application, a $600 package for creating image files. Vaximage Application Services is a set of software tools for developing image applications. It is priced at $1,200 for a single-user license. The products are scheduled to be available in February. Applications written for Ultrix cannot run on the reduced instruction set computing Decstation 3100 without some porting. Out of roughly 1,000 Ultrix applications, 20 have been ported to the Decstation 3100. Ultrix will eventually be offered in a version that includes symmetrical multiprocessing. Until then, the Vaxstation 3520 and Vaxstation 3540 will be able to run Ultrix only in asymmetrical multiprocessing mode. <<<>>> Title : Unisys microsizes it Author : CW Staff Source : CW Comm FileName: unimicro Date : Jan 23, 1989 Text: Unisys Corp. will bring its mainframe down to the desk top this week with the announcement of a microcomputer-size extension of its A series mainframe architecture. Scheduled for a Wednesday announcement, the desktop system, which sources said has been dubbed the Micro A, will establish a new mainframe entry point for Unisys. It is intended to lure potential small-business customers and to prevent existing A series customers from looking elsewhere for an intelligent desktop device, industry observers said. Sources said last week that the system will have a stripped-down price of about $20,000. With the added cost of software and peripherals, the system will actually be in the $40,000 range. That pricing scheme puts the system in a league with the low end of IBM's Application System/400. A+ series A Unisys spokesman confirmed that the system is an extension of the A series, which comes from the Burroughs side of the house. He said the micro-mainframe is based on proprietary A series technology that packs 10 million transistors onto a 2-in. chip, known as the Scamp. 
The system, although desktop-size, will have ``all the traditional attributes of a mainframe'' such as extended memory, advanced communications capabilities and the ability to drive substantial I/O, the spokesman said. One A series user said the system will have eight data communications lines. It will also be fully software compatible with the other members of the A series line, which operate under the MCP/AS operating system, the spokesman added. The system will likely run Unisys fourth-generation languages Linc and Mapper, a source added. Analysts contacted last week said the Micro A is a sound move because it responds to users' demands for more power on the desk top. ``This will help offload things from the mainframe and match applications to the processing power available,'' said Jeffry Beeler, an analyst at Dataquest, Inc. Michael Geran, vice-president of research at Nikko Securities Co. International in New York, said the company decided to implement a desktop system on the A series side before doing so with the former Sperry line because it was behind schedule with other smaller Sperry systems. Also, the company determined that there would be more demand on the A series side since these models ``are smaller to begin with so customers are more interested in price,'' Geran said. Robert Moran, Computerworld's Mid-Atlantic correspondent, contributed to this report. By Rosemary Hamilton, CW staff <<<>>> Title : Boomerang Author : Nell Margolis Source : CW Comm FileName: 2stock12 Date : Jan 30, 1989 Text: The late Sir Isaac Newton got it just about right: When an apple falls on you, you are likely to discover gravity. Microsoft Corp., Compaq Computer Corp. and IBM were among the computer companies feeling a sense of gravity around the middle of last week, when price rollbacks and a slip in gross margins announced by Apple Computer, Inc. set off a wave of insecurity in the usually robust microcomputer market. 
By Thursday, however, IBM's own fourth-quarter earnings report had powered the stock back up to 124, a fraction of a point above its level of just over 123 at the week's start. Similarly, Microsoft announced strong December-quarter sales and profits; its stock closed Thursday up a fraction of a point, at just over 53. Compaq preannounced a fourth-quarter triumph and saw its stock pick up more than a point from the week's start, closing Thursday at just over 63. Digital Equipment Corp. turned in a disappointing fourth-quarter profit performance. Rising revenue and overall great expectations based on recent and imminent product debuts, however, fueled DEC stock to a Thursday close just above 104 _ a gain of more than 5 points from the week's start. NELL MARGOLIS <<<>>> Title : Bankruptcy cloud shadows Author : CW Staff Source : CW Comm FileName: 2cis2 Date : Jan 30, 1989 Text: Mid-size computer leasing companies ``either have to grow or die.'' _ CIS Chairman Harry E. Goetzmann Jr. [June '87] SYRACUSE, N.Y. _ For Continental Information Systems Corp., bigger was most assuredly not better. CIS' Jan. 13 filing for protection under Chapter 11 of the U.S. Bankruptcy Code evoked dark memories of the Itel Corp. bankruptcy of the early 1980s. But it is not expected to cast a cloud over the entire leasing industry. Rather, the CIS scenario highlights how fast things can turn sour after a major acquisition _ especially in a fiercely competitive industry in which reputation and creditworthiness are essential to survival. Now, with the taint of bankruptcy protection, the second-largest U.S. independent computer lessor will face its biggest challenge yet in assuring computer lessees that it will be a viable player. ``I'm not going to consider them [for future business] _ no way, shape or form,'' said Martin Phipps, operations officer at Baltimore-based Provident Bank of Maryland, which has current CIS leases for an IBM 3083 CPU and a 3380 and several 3330 disk drives. 
``People are scared to death.'' Nonetheless, CIS lessees should see little change in their current leases, although their lease payments may go to a bankruptcy court trustee or other party. Many lessees are already paying directly to a CIS lender, as CIS sold its equity in many deals during the past few months to help ease its cash flow problems. ``There really should be no impact on the customer at all,'' said L. Crandall Hays, who follows CIS for Milwaukee-based investment firm Robert W. Baird & Co. CIS sought to reassure its customers that little should change, at least in the near term. ``It is our intention to do business just as we have before, and the Chapter 11 filing affords us the time to restructure,'' said Paul Brooks, CIS vice-president of corporate communications. ``I think we have a lot of loyal customers and believe our sales force is capable of maintaining those relationships.'' The seeds of the CIS downfall were planted in the company's bold 1987 buyout of archrival CMI Corp. in Bloomfield Hills, Mich. CIS vowed that the $160 million deal would create a leasing giant that would rank near IBM Credit Corp. and Comdisco, Inc. in strength, stability and success. Instead, the deal created a highly leveraged business that, when it began to stumble, created a chain reaction of stockholder, lender and customer doubts, all feeding off each other. ``When you lose your credibility, all those things happen very quickly,'' said Richard Kazan, chairman of Capital Associates, Inc., the third-largest U.S. independent lessor. ``It just shows you how fragile that credibility is.'' Some lessees saw a difference in service and responsiveness very soon after the CMI buyout. ``They spread their people so thin that it wasn't possible to maintain the same level of contact'' with customers, said Rick Mudrow, purchasing coordinator at Pacific Telecom in Vancouver, Wash. 
``They gobbled up too much portfolio for the cash flow they had and couldn't make the pieces fit.'' The CMI acquisition strategy worked for a while, as CIS generated record revenue and profits in the first six months after finalizing the acquisition in August 1987. CIS even felt confident enough to add two smaller acquisitions for a total of $5.2 million in that period. But things began to unravel in CIS' 1989 fiscal year, begun March 1, 1988. The biggest problem was the $110 million former CMI debt that CIS agreed to take on as part of the acquisition. To service that debt, ``CIS needed to increase both business volumes and margins,'' said Thomas J. Donovan, director of investment banking services at Framingham, Mass.-based market research firm International Data Corp. ``The 1988 market did not provide the opportunities,'' he said. Many lessors felt the pinch in 1988 as a newly aggressive IBM Credit quoted low rates that cut profit margins to the bone [CW, Oct. 10, 1988]. In addition, Donovan said, leasing customers expected IBM's high-end 3090 models much earlier in the year and dramatically slowed their leasing activity until the 3090 S series began volume shipments in the fourth quarter. CIS also suffered from mounting internal woes related to CMI, including incompatible computer systems. According to a source close to CIS, the company has not been able to integrate its Hewlett-Packard Co.-based information systems with CMI's IBM-based applications. CIS had acquired CMI without CMI's top management, which lost a court fight to block the sale. Several top executives immediately formed competitor Encore International, Inc. and eventually hired CMI founder Edward Cherney. Although many ex-CMI employees stayed with CIS, several key marketing managers went to Encore International. CIS may have the potential to satisfy its creditors and exit Chapter 11, but its future beyond that remains highly uncertain. 
A deep-pocketed acquirer may be the only answer to restore the credibility that lessees demand. ``In any lease transaction, we take a jaundiced view of any company that appears to be having financial difficulty,'' Pacific Telecom's Mudrow said. By Clinton Wilder, CW staff <<<>>> Title : VAX power curve due for o Author : CW Staff Source : CW Comm FileName: 2decprod Date : Jan 30, 1989 Text: Digital Equipment Corp. will add processing power to the heart of its VAX family tomorrow, introducing a series of machines that will collide with existing product lines and put a price/performance squeeze on the high-end VAX 8800 models, sources close to the company said. The multiprocessor-based 6300 line will be fired by an improved CVAX microprocessor that can handle 4.2 million instructions per second, making it 50% faster than the 6200's 2.8-MIPS processor, said John Logan, executive vice-president of the Aberdeen Group, a Boston-based market research firm. The machines will also offer the added capability of placing as many as six processors into a single system, he said. Sources expect the 6300 models to be only 5% to 7% more costly than the popular 6200 series; base prices for the four-member 6200 line range from $175,300 for the 6210 to $556,600 for the 6240. If these indications prove true, the 6300 could discombobulate the multiprocessor VAX line by putting intense price/performance pressure on the 8800 series; like the 6200, the 8800s consist of one to four VAX 8700 processors and are capable of performing symmetrical multiprocessing. A top-of-the-line six-processor 6300, for example, could offer more than 20 MIPS for around $600,000, while the 8840 offers 22.2 MIPS for $1.5 million. The Maynard, Mass.-based firm would most likely put the 8800 models on life support by cutting their price, Logan added. DEC will also reportedly offer 6200 users the option of swapping their old processors for the new models for a cost of about $8,000 per processor. 
The 6200 has been a popular member of the VAX family since it brought symmetrical multiprocessing to the heart of the VAX line upon its introduction last April [CW, April 25]. Observers say you can't blame DEC for sticking with a winner, even if the chance exists that the more powerful 6300 will cannibalize some of the 6200's sales. ``The company has to stimulate its U.S. business, so why not embellish a product that is already moving?'' said Barry F. Bosack, an analyst at the New York-based Robert Fleming Securities Ltd. research house. Users say the increased power and potential of the new models make upgrading an attractive option. ``If you compare the cost increase with the performance increase, we'd be fools if we didn't upgrade,'' said Gene Burson, manager of the 6210-equipped Toledo Edison Co. in Toledo, Ohio. ``We see the 6300 as nothing more than a welcome extension to the 6200.'' The 6300 series also addresses mid-range pressure from IBM's 9370 and Application System/400. ``DEC has realized that they have to respond to inroads that IBM has made with the AS/400, and the best way to do that is beef up its own mid-range,'' said Terry Shannon, director of the DEC Advisory Service arm of International Data Corp. in Framingham, Mass. ``Better that than continue to milk sales of the 8800, which isn't doing too well anyway.'' Observers said the 6300 could go a long way in finishing off the 8800 series, which has been the victim of lackluster sales and faces future internal pressure from above. DEC scientists are reportedly working feverishly on an air-cooled high-end uniprocessor, code named Aridus, that can process 15 to 20 MIPS and is expected for release this summer. The move is also seen as boosting the performance of its on-line transaction processor (OLTP) VAX line, as symmetrical multiprocessing is better suited than a uniprocessor to OLTP applications. 
One-two punch Analysts described the upcoming rollouts as the latter half of a one-two product punch from DEC. ``Their recent desktop product announcement focused so much on the low end that they need to put some attention back on the mid-range,'' said Bob Randolph, director of program services with Technology Financial Services, Inc. in Westford, Mass. ``So the message will be: `Dear customer: We are viable at both ends of our product line.' '' The 6300's introduction will also follow DEC's mid-range modus operandi of offering a mid-life kicker within a year after a product's release. The 8600's introduction in November 1984, for example, was followed by the 8650 in late 1985. Tomorrow's announcement will probably also include a server version of the 6300 based on the one- and two-processor models, sources said. Such a machine could tie in with their recent desktop product blitz [CW, Jan. 16], providing a mass data storage depository and communications controller node for the desktop machines. By James Daly, CW staff <<<>>> Title : ...while Wang aims higher Author : CW Staff Source : CW Comm FileName: 2newwang Date : Jan 30, 1989 Text: Wang Laboratories, Inc. plans to announce a high-end minicomputer next week _ its first offering in the double-digit MIPS category. But analysts contacted last week said the system is still not enough to give the firm the big boost it badly needs. A company official confirmed last week that Wang will introduce the VS10000, a uniprocessor system estimated to run at 12 million instructions per second, which is roughly four times the performance of its current high-end machine, the VS7310. Despite this leap in performance, analysts contacted last week were not bullish on Wang. Noting the disappointing quarterly results the company reported last week (see related story page 93), analysts said Wang is facing an uphill battle for new mid-range dollars. 
``For the first time in a decade, IBM has a competitive mid-range offering,'' said Michael Geran, an analyst at Nikko Securities in New York. ``That's causing trouble for everyone. DEC is responding this week, Unisys already responded and now Wang will. It's a cat fight.'' The new system is the second recent offering from Wang that goes up against the Application System/400, IBM's rising star in the mid-range arena. The company introduced the low-end VS5000 shortly after the AS/400 announcement last year. According to Marty Gruhn, vice-president of The Sierra Group, Inc. in Tempe, Ariz., the company needs more than ``a new mousetrap'' to help it along. ``They need to focus on new business, instead of depending on upgrades,'' he said. ``They fixed up their financials somewhat in the last year by going back to the installed base. But that's over. You can't keep getting blood from a turnip.'' The VS10000 is currently installed at several beta-test sites, including Admiral Cruises in Miami, according to a Wang spokesman. David Breeze, MIS director at Admiral Cruises, confirmed that his company installed a system in December and recently went into production mode with it. He would not provide more details because of a nondisclosure agreement. The Wang spokesman said the VS10000 announcement will include software enhancements as well as a new disk subsystem that will accommodate 1G-byte disks. The high end will run the VS operating system as well as VS/VM, which allows a Wang system to run both VS and its Unix implementation concurrently. The system will be air-cooled and based on emitter-coupled logic technology, he added. By Rosemary Hamilton, CW staff <<<>>> Title : Compaq will test 386 limi Author : CW Staff Source : CW Comm FileName: 1332 Date : Jan 30, 1989 Text: Compaq Computer Corp. is again prepared to raise the ante in the personal computer speed game with a spring release of an Intel Corp. 
80386-based system that runs at a 33-MHz clock speed, according to several sources close to Compaq. The PC, which reportedly will incorporate an unreleased version of the Intel processor, will break away from the 25-MHz performance plateau of current high-end systems. The $10,000 system will be housed in a Compaq Model 20E chassis. Intel's new microprocessor, which is not yet generally available to customers, will run at 33 MHz, an Intel spokeswoman said. Currently, the fastest 386 runs at 25 MHz. Compaq's strategy is to launch the machine before IBM makes its expected low-end product blitz this spring, according to a source briefed by Compaq. With the new Intel processor, the Compaq system will be capable of 1.4 million floating-point operations per second (MFLOPS), the sources said. Meanwhile, the IBM Model 70A-21, based on the Micro Channel Architecture, can attain 0.8 MFLOPS, said John Dunkle, vice-president of Work Group Technologies, a workstation research group in Exeter, N.H. Dunkle said that a 33-MHz Compaq machine would not be fully exploited until it included a 32-bit bus. With a 32-bit data path, he said, the machine could be used as a high-power computer-aided design and manufacturing workstation but will be primarily intended as a local-area network server. This strategy would allow Compaq to market the high-priced PC on a cost-per-seat basis, he said. A Compaq spokesman said the 33-MHz machine was ``not a subject up for discussion.'' An IBM source said the firm would ``respond quickly'' with its own high-speed machine based on the 33-MHz processor when it becomes available. Compaq denied reports by two sources that the machine would be upgradable to an Extended Industry Standard Architecture (EISA) bus when that bus becomes available. But Compaq has previously disclosed that it intends to introduce an EISA machine in the $10,000 price range some time this year. 
Shortly after the EISA bus proposal was announced in September, Compaq's Director of Marketing Mike Swavely said the EISA machines would mostly be used as LAN servers. By William Brandel, CW staff <<<>>> Title : E-mail bust generates pri Author : CW Staff Source : CW Comm FileName: headless Date : Jan 30, 1989 Text: SAN JOSE, Calif. _ When deputies from the Riverside County, Calif., coroner's office raided the offices of the Alcor Life Extension Foundation, they were looking for the head of possible murder victim Dora Kent. They did not find the head, which had been cryonically frozen at death in hopes of later resuscitation. Instead, they took the foundation's eight personal computers, including the electronic mail stored within. As a result, three San Jose computer consultants, led by Keith Henson, filed a class-action lawsuit against the Federal Bureau of Investigation last month for failing to investigate what they claim was a violation of the federal Electronic Communications Privacy Act of 1986. Henson said that while the county's search warrant allowed seizure of computers and storage devices, it did not specify confiscating electronic communications and thus violated federal law. The consultants said they spent a year trying to get the FBI to check into the county's legal standing in seizing private communications without a warrant. According to the lawsuit, the U.S. Attorney's Office provided ``no substantive response'' to Henson's request for an investigation. A letter dated Nov. 4, 1988, and addressed to Rep. Norman Mineta (D-San Jose) from the U.S. Department of Justice said that ``there is no competent evidence upon which to base a federal prosecution.'' The U.S. Attorney's Office, on behalf of the FBI, has yet to file an answer to Henson's complaint and refused to comment on the lawsuit. The class-action suit seeks to represent all users of E-mail as well as members of Alcor. 
The nonprofit organization will store all or part of the bodies of its members at death at very low temperatures ``until medical technology exists so they can be revived,'' according to Hugh Hixon, Alcor board member. ``Evading death is a very serious matter,'' he said. Specialists in computer security law say that the Electronic Communications Privacy Act is ill-defined and has little case law to back it up. Jonathan Wallace, a New York attorney specializing in computer-related law, said the act's biggest problem is that ``it doesn't clarify [E-mail such as Alcor's] status as a closed system.'' He added that if the judge issuing the warrant was not told of the E-mail's existence, then Henson ``has a decent argument.'' The act requires that a warrant can be issued for E-mail ``only if a governmental entity shows . . . relevancy to a legitimate law enforcement inquiry.'' The lawsuit asks that the FBI investigate the actions of Riverside County law enforcement in this matter. Meanwhile, the county, which is not named in the case, has handed its investigation into the possible homicide of Dora Kent to the grand jury. By J.A. Savage, CW staff <<<>>> Title : Sperry patriarch dies Author : CW Staff Source : CW Comm FileName: obit2 Date : Jan 30, 1989 Text: SALT LAKE CITY _ Gerald G. Probst, the Sperry Corp. executive credited with shaping the former conglomerate into a $5 billion computer giant and shepherding it through the 1986 merger that created Unisys Corp., died in his sleep at his home here early last week. Probst was 66 years old. The cause of his death has not been determined. A decorated World War II pilot who plied his electronics expertise as a career officer in the U.S. Air Force's Research and Development Command before joining Sperry in 1961, Probst is remembered as a reserved, considerate executive who made a difference without making a fuss. 
Moving from Sperry's defense side of the business to the commercial side and attaining increasingly responsible positions within the firm, Probst focused on establishing Sperry's commercial products in a credible market position. As chief executive officer of the company in the early 1980s, Probst spearheaded the restructuring of the conglomerate into a technology-oriented company, divesting it of farm equipment and hydraulics divisions in the process. ``That was a very bold move for him because a lot of people on the board were associated with those companies,'' recalled Joseph Kroger, who served as president of Sperry during Probst's chairmanship. Shortly after the merger was consummated, Probst retired from Unisys. However, he remained a member of the Unisys International Advisory Board. By Nell Margolis, CW staff <<<>>> Title : Rolm users look to Siemen Author : CW Staff Source : CW Comm FileName: ibm2 Date : Jan 30, 1989 Text: Despite an unsettling silence from prospective partners IBM and Siemens AG, some Rolm Systems users are eager for the West German electronics firm to move in and fill the service vacuum they have experienced under IBM's regime. Last December, the two vendors proposed the sale of Rolm's manufacturing and development arm to Siemens and the formation of joint service, marketing and research and development organizations. Their subsequent silence, while awaiting Federal Communications Commission approval of the proposals, has fostered widespread speculation that IBM is backing away from both Rolm and telecommunications. However, some users seemed less concerned with how soon IBM will step down than with how soon Siemens will step up to its Rolm responsibilities. ``We're hoping that Siemens will bring ISDN to the 9751 faster than IBM might,'' said Ferrell Mallory, director of communications systems at Brigham Young University. 
``I can see why IBM is bailing out of Rolm; I think [the 9751] is a piece of garbage,'' said William Fallace, communications coordinator at Southern California Gas Co., whose firm recently made a major 9751 purchase. ``I would just as soon IBM bowed out completely; I've heard a lot of good things about Siemens, and I hope they'll step up with some kind of magic to fix these things,'' he added. The magic that Siemens apparently contemplates is transferring the best features of its own Saturn and Hicom lines to the Rolm private branch exchanges (PBX), and vice versa, according to company spokeswoman Susan Goff. ``Both [firms] have strengths. For example, Rolm Phonemail is fully integrated into the 9750, while we OEM our own Phonemail. Saturn has stronger data communications.'' Reassurance The company plans to link and provide common technological enhancements to its product lines, rather than converging them, Goff said. This is clearly meant to reassure users who envisioned Siemens treating its new Rolm PBX line as back numbers _ as IBM reportedly has treated the older Rolm models. ``Users would like to see Siemens restore support that Rolm was previously offering for its older 8000 and 9000 PBX lines, which IBM has been moving away from,'' said one telecommunications manager, who requested anonymity. But the big question, analysts said, is how long IBM intends to share Rolm PBX support with Siemens. ``Users are asking, `Is IBM dumping Rolm or is it serious about creating a synergy with Siemens?' '' said Bill Redman, service director of Local-Area Communications at Gartner Group, Inc. in Stamford, Conn. 
``The pessimistic view is that IBM will give lip service to the venture and bow out in a few years, leaving Siemens to mismanage Rolm, lose market share and not integrate the [Rolm] 9751 with Siemens PBXs.'' IBM perplexed the industry by ``being so gung ho about shipping 9751s,'' then moving to rid itself of Rolm, according to Eric Schmiedeke, product director at Eastern Management Group. When the high-end PBX began shipping last April, Rolm jumped from third place in terms of market share _ a spot it had held since IBM bought it in 1985 _ to the No. 2 slot, just above Northern Telecom, Inc. and behind AT&T, according to the Parsippany, N.J., market research firm (see chart). But while the 9751 won sales by filling a crucial gap in Rolm's product line, IBM also got accounts by selling the PBX at well below cost, particularly to its own large mainframe shops, Schmiedeke said. Thus, many users feel that IBM's move to form a jointly owned marketing and service department with Siemens is simply ``an interim step to divesting itself of the entire Rolm unit,'' he added. Linking PBXs to hosts Rolm PBXs will continue to round out IBM's telecommunications product line, and IBM plans to work with Siemens to deliver on promised 9751 enhancements such as Integrated Services Digital Network, according to Frank Elliott, director of communications systems at IBM's marketing and services group. The Rolm switches _ as well as other vendors' PBXs _ will also play a role in IBM's plans to link PBX networking with host applications, he added. However, Elliott indicated that IBM might not be committed to servicing Rolm PBXs directly, as one of its own products, over the long term. He cited IBM's Telecommunications Services Network Support program, announced last fall, which helps users pinpoint problems on their voice/data networks and contact the right vendor to provide service. 
Rolm, he said, ``would come under that umbrella.'' By Elisabeth Horwitt, CW staff <<<>>> Title : Friday the 13th virus bac Author : CW Staff Source : CW Comm FileName: valvirus Date : Jan 30, 1989 Text: Dozens of firms in California's Silicon Valley were still battling last week to quash a virus epidemic that struck personal computers Jan. 13. ``The Friday the 13th virus has become a massive problem over the entire Valley,'' said John McAfee, chairman of the Computer Virus Industry Association of Santa Clara, Calif. ``Seventeen companies and scores of individuals have contacted me about this latest epidemic.'' Three machines and some 300 disks were wiped clean of the virus at EG&G Geometrics, Inc. in Sunnyvale, Calif., said Lynn Edwards, production supervisor. The virus corrupted programs on one of the personal computers at the same time it was being used in a marketing presentation to senior executives of a prospective client. The contract bid, the culmination of several months' work, had to be rescheduled. ``We found out that we were infected when we suddenly lost all of our files,'' Edwards said. ``The virus has since been cleared out, and we are in the process of putting things back together again.'' Edwards refused to say exactly how the virus managed to work its way into the company's computers. ``I am not absolutely certain, so I am reluctant to say where the virus came from because of the legal implications,'' he said. McAfee said that several of the firms and individuals that contacted him reported they had either purchased PCs or had their PCs serviced at a popular computer retail operation. He added that the outlet may have unwittingly transferred the virus to its customers' machines while formatting hard disk drives on new machines or when running diagnostics programs on personal computers being repaired. The manager at the retail operation identified by McAfee said he had no knowledge of the virus incidents. 
Wipeout The virus, which contained a ``time bomb'' set to go off on Friday, Jan. 13, was designed to infect and wipe out programs as they are executed. The same virus reportedly also hit hundreds of PCs in the UK on the same day. The virus is thought to be a modified version of the Israeli or Jerusalem virus that plagued computer users in Israel over a two-month period last year. That virus, believed to have been concocted as a political protest, was set to go off May 13, 1988, the day before Israel celebrated the 40th anniversary of its founding. By Michael Alexander, CW staff <<<>>> Title : Corrections Author : CW Staff Source : CW Comm FileName: fixer2 Date : Jan 30, 1989 Text: Due to an editing error, the Solbourne Series 4/600 was incorrectly referred to as the Sun-4/600 in several references [CW, Jan. 16]. Product data should have appeared as follows: The machines include the eight-model, one- to four-processor Series 4/600, which offers between 9.5 million and 30 million instructions per second (MIPS) and is capable of producing 1.6 million to 4.7 million floating-point operations per second. By way of comparison, a two-processor Series 4/602 with 16M bytes of memory, a 327M-byte disk and a 150M-byte cartridge tape yields up to 17 MIPS for $51,400 _ bettering the performance of a similarly configured Sun-4/260 by 70% at a 14% price break, a Solbourne spokeswoman said. Japanese electronics giant Matsushita Electric Industrial Co., which owns 52% of the firm and is manufacturing the Series 4, funded Solbourne's development effort with $11.75 million. <<<>>> Title : IBM profits rebound; DEC Author : CW Staff Source : CW Comm FileName: decibm2 Date : Jan 30, 1989 Text: Fueled by stellar sales of its AS/400 minicomputers, IBM logged a double-digit percentage increase in profits and a 9.3% revenue rise that left analysts assured that the industry giant is back. Meanwhile, in Maynard, Mass., hefty research and development expenses and continuing softness in U.S. 
sales of its high-end entries handed Digital Equipment Corp. a disappointing profit picture. Analysts, however, hailed DEC's better-than-expected revenue and placed products above profits. In Mountain View, Calif., Sun Microsystems, Inc. announced dazzling results for its second fiscal quarter ended Dec. 30: revenue of $448.3 million, up 91% from last year's comparable quarter, and a profit surge in excess of 100%, from last year's $14 million to $29.5 million. The strong sales performances of Sun, DEC and IBM reiterated a message already being sounded throughout the industry as earnings reports began flooding in early last week: Users want new products, on time, in working shape _ and will tolerate nothing less. It was a message that many companies did not heed in the December quarter (see story page 93). IBM reported fourth-quarter revenue of $20 billion, up 9.3% from last year's comparable period. Net earnings for the company increased 12% to $2.35 billion. Moreover, said Shao Wang, an analyst at Smith Barney, Harris Upham & Co., the company showed its best operating margins for any quarter since 1985. Don't ask ``The question is no longer, Has IBM turned around?'' Wang said. ``Now it's, How strong is the turn?'' The December quarter yielded several reasons to believe that the answer will be ``very,'' Wang noted. The AS/400 _ IBM's most recent entry in a minicomputer market that once appeared to be IBM-proof _ looms particularly large in the company's rising revenue picture, with continued heavy demand forecast by analysts. Other product lines, including the recently available S series mainframes, also contributed to the healthy earnings. A program launched to prompt employee attrition and consequently cut costs for the company worked and then some, paring some 6,500 from the employee rolls in place of the 3,000 to 4,000 expected by the company. 
While the separation costs associated with the exodus generated a onetime $270 million charge against fourth-quarter earnings, the sleek new employee profile positions IBM well for the coming year, analysts agreed. DEC's 14% revenue rise to $3.18 billion for its second fiscal quarter of 1989 _ higher than analysts had predicted _ was offset by a 15% decrease in net earnings, down to $279.58 million from $329.53 million in the second quarter of last year. ``I think the issue here is one of domestic vs. international sales,'' Smith Barney's Wang explained. ``Europe is still carrying the day for DEC.'' The company's U.S. business, noted David Wu, an analyst at S. G. Warburg & Sons, is down approximately 5%, largely because of sluggish sales of the high-end VAX 8800. However, Wang said, ``there is a certain amount of `it doesn't matter' about all of this; when it comes to DEC, all eyes are on fiscal year '90.'' Within six months, the R&D that bogged down DEC's net income over the past two quarters is expected to show highly profitable results. This week, the VAX 6300 minicomputer is scheduled to debut (see story page 1). Wu said the new model is ``basically about one-third faster and 5% more expensive than the 6200; and they've shipped about $1 billion worth of the 6200 so far.'' Analysts are also banking on a particularly strong workstation showing by DEC and a new high-end entry this summer that will ``make the 8800 obsolete in the U.S. within the year,'' Wu said. By Nell Margolis, CW staff <<<>>> Title : CA, Microsoft log good Q4 Author : CW Staff Source : CW Comm FileName: cam2 Date : Jan 30, 1989 Text: Computer Associates International, Inc. and Microsoft Corp., the leading software companies at the high end and low end, respectively, each issued the latest in a long line of impressive quarterly earnings reports last week. CA announced fourth-quarter revenue of $309.4 million, up 59% from last year's comparable period. 
Net income for the quarter was $62.7 million, a 47% increase over last year's $42.8 million profit figure. In a prepared statement, President Anthony J. Wang cited the smooth integration of recently acquired Applied Data Research, Inc. as a particular reason for the strong results. Microsoft reported a 35% increase in December quarter revenue, from $155.9 million in last year's comparable quarter to this year's $209.9 million. Net income was $47.5 million, up 34% over last year's profit figure for the company's second fiscal quarter. The sales surge, a Microsoft spokesman said, reflected an increasingly strong contribution from international business, now responsible for 57% of Microsoft's overall revenue. <<<>>> Title : RISC vs. CISC Author : Rosemary Hamilto Source : CW Comm FileName: risc2 Date : Jan 30, 1989 Text: When Jim Geers, president of AIM Technology, Inc., recently set out to test reduced instruction set computing (RISC) systems against complex instruction set computing (CISC) systems, he said he expected the RISC contenders to outperform the CISC challengers. But he did not expect the RISC systems to beat the CISC ones as soundly as they did. In a series of Unix-based tests, the RISC systems consistently doubled and tripled the performance and capacity of CISC architecture machines. Geers noted that for all their better numbers, the RISC systems still lag in the applications software area when put up against conventional processors. ``It appears to be a trade-off of performance and third-party software,'' he said. ``If you're in the scientific area, then the RISC performance will be attractive. If you're in a business environment, then third-party applications are likely more important.'' To test the two processor types, AIM started with a Digital Equipment Corp. VAX-11/780 as a reference point and judged the contenders against it. The test included only Unix-based systems for both sides. On the conventional side, Intel Corp. 80386-based processors and Motorola, Inc. 
68030 processors were used. On the RISC side, offerings from Sun Microsystems, Inc., Hewlett-Packard Co. and Mips Computer Systems, Inc. were used. For the system performance test, AIM established the VAX-11/780 as operating at 100%. Against that 100%, the RISC systems tested at an average of 680%. The CISC challengers averaged 308%. The RISC system average was boosted primarily by the performance of the HP system, which came in at 1,004%. To establish an average number of users that the two system types could accommodate, AIM used a 12-user VAX-11/780 configuration as the reference point. The RISC systems supported an average of 91 users when compared with that base, and the CISC systems supported an average of 37 users. Geers said a comparison of processor speed alone would not be fair to users because they rely on other system components as well, such as disk, memory and floating-point performance. ROSEMARY HAMILTON <<<>>> Title : Inside lines Author : CW Staff Source : CW Comm FileName: 2liner12 Date : Jan 30, 1989 Text: Coming soon, a choice in Unix mainframes. Just when Amdahl thought it had the Unix mainframe market all sewn up after National Advanced Systems closed down its external Unix development shop late last year, a new competitor shows up on the doorstep. Feb. 13, Pyramid Technology is expected to expand its RISC-based Unix minicomputer line to low-end mainframes. And, unlike Amdahl's proprietary brand of Unix, Pyramid's is said to be based on an open standard. And, or . . . Not only is Pyramid entering the mainframe world, Gene Amdahl's latest start-up, Andor, is still intent on getting a prototype of its low-end IBM-compatible mainframe operating by this summer. Look for availability of this supposedly tiny system, with a price tag aimed at undercutting low-end IBM 3090s by late 1990. Nothing personal. 
It's costing DEC customers an additional $3,300 in hardware for the Vaxstation 3100 to run Decwindows, said John Logan, vice-president of the Aberdeen Group, a market research firm in Boston. The ``personal VAX,'' a diskless Vaxstation with a processor capable of 3 MIPS and a monochrome terminal, could have been sold for $5,500, Logan said. But to ensure that users would run the memory-demanding Decwindows, DEC loaded the system with 8M bytes of memory for an $8,000 price tag. For customers not interested in Decwindows, the 1-MIPS Vaxstation 2000 with 4M bytes of memory sells for only $5,200. Maybe later. As expected, Apple did not make any communications announcements at Macworld Expo, but word has it that Apple's next big batch of connectivity unveilings is slated for Feb. 8 at Dexpo. Tokentalk, Apple's token-ring card, could be slated for release in April, at either Macdex in Chicago or a Mac show in Washington, said an analyst quoting an Apple insider. This much-awaited adapter has reportedly undergone three revisions so far and will not be released until Apple has all its support pieces in place, he said. Another source adds that Apple has had a working token-ring card for about a year. Sitting down to tea. X/Open is expected to announce today new members, one of which will be a Japanese computer maker. (Fujitsu is already a member.) X/Open's current member roster consists of seven Open Software Foundation members, seven Unix International members, and one neutral firm (Nokia Data). With that lineup, one might imagine that it could be hard to reach a consensus on some issues. The new members are likely to shift the balance of power by adding more nonpartisan influence. We'll send congratulations. Apollo has scheduled a press conference for next week. Anybody showing up for something new and completely different is in for a big surprise. 
The company apparently intends to ``announce'' that it is now ready to deliver the graphics component on its RISC-based workstations; the workstations were announced back in March 1988, and the graphics component was supposed to be ready by the end of the year. Although the Chelmsford, Mass.-based firm is all set to trumpet the 10000 series as the industry's first RISC-based graphics workstation, it seems that there's a little Mountain View, Calif.-based company called Silicon Graphics that has been including Mips Computer Systems RISC CPUs on its Iris 4D workstations for some time now. Ardent Computer also has a similar RISC-based machine. All in all, it makes you wonder just how gullible Apollo thinks its customers are. Hold back the dancing persons. DEC's recent desktop extravaganza went to new artistic lengths with a multimedia presentation and an opening video that portrayed dancers in an office setting complete with DEC workstations. Not seen, however, were live dancers who were to act out the video. They were canceled at the last moment after somebody gave it a second thought. If you know the details of this seamy inside political scenario, call News Editor Pete Bartolik at the hot line number, 800-343-6474 or 508-879-0700, and we'll let everyone in on it. ET <<<>>> Title : DEC settles VAXBI suit Author : CW Staff Source : CW Comm FileName: emc2 Date : Jan 30, 1989 Text: Digital Equipment Corp. and EMC Corp. settled their legal differences out of court last week, with each company declaring itself the winner. Under the terms of the accords announced last week, EMC acknowledged infringement of DEC technology rights; the $127 million firm agreed to pay DEC $100,000 for patent infringement and to cease making, using or selling the VX82 and VX83 memory products. ``Our hearts said fight, but our brains said settle,'' said W. Paul Fitzgerald, vice-president of finance at Hopkinton, Mass.-based EMC. 
He contended that his company does not believe it infringed on DEC's property rights but was nevertheless willing to concede the charge to forestall costly litigation that could lead to no better than a Pyrrhic victory. The products in question, Fitzgerald said, had produced so little revenue that they had already been targeted for deletion from the EMC line. On the other hand, facing DEC in court was likely to cost at least four times what EMC paid DEC in the settlement, he added. In related news, DEC and EMC filed a consent judgment and settlement agreement with regard to EMC's outstanding breach of contract claims against DEC. Both parties agreed to uphold the original contract, which, Fitzgerald said, was EMC's goal. By Nell Margolis, CW staff <<<>>> Title : Apple's SE/30 bridges Mac Author : CW Staff Source : CW Comm FileName: macnew3 Date : Jan 30, 1989 Text: SAN FRANCISCO _ Apple Computer, Inc. last week introduced a Macintosh that users say is a bridge between the economy of entry-level systems and the power of the pricey Macintosh II. Additionally, during introductions at Macworld Expo here, Apple unveiled enhancements to A/UX, its version of the Unix operating system, and Macworkstation, a tool for developing applications that allow the Macintosh to retain its ``look and feel'' when connected to a mainframe. An extension to the Mac SE line, the Mac SE/30 is powered by Motorola, Inc.'s 68030 microprocessor running at a clock speed of 16MHz. Apple officials said it is up to four times faster than the entry-level Mac SE, which is based on the Motorola 68000. Apple officials have said the Mac SE, which features the traditional Mac look _ a single unit for the monitor and base _ will form the basis of one product family. The modular Macintosh II has spawned a second product family. Mac SE and II users will be offered upgrade packages that allow them to migrate to more fully featured members of each respective product family. 
An upgrade kit allowing Mac SE users to attain Mac SE/30 functionality will be announced in the spring, officials said. The Mac SE/30 will be offered in three configurations: an entry-level version with a single floppy disk drive; a mid-range model with a 30M-byte hard drive; and a high-end version with an 80M-byte hard drive. They cost $4,369, $4,869 and $6,569, respectively. The entry-level and 30M-byte hard drive configurations offer 1M byte of random-access memory, while the 80M-byte hard drive model comes standard with 4M bytes of RAM. All three models offer Apple's 1.44M-byte ``high-density'' floppy drive, also called the ``Superdrive,'' which can read, write and format Microsoft Corp. MS-DOS and OS/2 diskettes. Using the Apple File Exchange utility, Mac users can access and transfer MS-DOS and OS/2 files. While Apple has been bashed for its pricing structure, the prices of the new systems, along with price cuts up and down its lines (see story, page 7), were greeted warmly. ``It's better pricing than I thought it was going to be,'' said Mary Howlett, manager of office automation for Hughes Aircraft Co.'s Ground Systems Group. ``It offers us a nice middle-of-the-road system,'' she said. ``We don't have to jump to a Mac II to get more power.'' Howlett said the 68030 will offer users enough power to take advantage of an anticipated new version of the Mac operating system, which will reportedly offer multitasking capabilities. Edwin Sund, senior systems engineer for Weyerhaeuser Information Systems' PC support group, said the Mac's improved speed is a ``godsend. Where you really need it is in networking and database applications. ``We're not going to dump our old Macs and buy [Mac SE/30s],'' he maintained. ``It'll be the workstation shuffle. 
The old Macs will be handed down to people who don't need the power, and we'll replace them with the newer models.'' Macworkstation 3.1 Macworkstation Release 3.1 adds enhancements that will allow developers to create applications more simply than previously. Developers can paint dialogs directly on the screen rather than enter code to generate dialog boxes. Also, an event handler offers local intelligence for both the front and back ends. A new release of A/UX adds support of X Window System Version 11, Release 3; offers the ability to run Hypercard and other desk accessories from the Apple Toolbox; and eases the development of Mac applications able to run on both the Mac operating system and A/UX. A criticism of the first version was that many Mac applications would not run under A/UX as originally promised. Also, users felt that it did not take advantage of the user-friendly features of the Mac operating system. The new version will ship in March at a price of $595 on diskettes. Support of X Window Version 11, Release 3 is an additional $329. As usual, Apple is setting high expectations for itself. ``We believe it is our fate to develop the most influential intellectual and cultural tools known to mankind,'' said Jean-Louis Gassee, president of Apple's Products Division. By Julie Pitta, CW staff <<<>>> Title : Apple takes a slice off M Author : CW Staff Source : CW Comm FileName: macs2 Date : Jan 30, 1989 Text: SAN FRANCISCO _ Apple Computer, Inc. reduced prices on several models of its Macintosh personal computer last week to coincide with the debut of a system at Macworld Expo. The reduction comes nearly four months after the company announced a dramatic price increase, which it blamed on the high cost of memory components. That hike was met with widespread criticism from Apple customers. While users were pleased with Apple's pricing changes, investors were apparently turned off by the company last week. 
Releasing its latest quarterly earnings report, Apple indicated that gross margins had decreased, prompting a decline in its stock price (see story page 93). Apple USA President Allan Loren said last fall's increase altered customer buying patterns, turning them away from more expensive, fully featured Macintoshes and toward cheaper, entry-level models. As a result, Apple is selectively dropping prices to ``fully expand momentum'' for the line, Loren noted. Also, memory component costs are falling, a trend that is expected to continue, he said. Prices for four memory-loaded Macintoshes were reduced. The Motorola, Inc. 68000-powered Mac SE with 2M bytes of random-access memory and a 40M-byte internal hard disk drive was reduced to $4,369 from a previous level of $5,069, a decrease of 14%. A Motorola 68020-based Mac II with 4M bytes of RAM and a 40M-byte internal hard drive was cut to $7,369 from $8,069, a reduction of 9%. Two models of the recently introduced Mac IIX, a Motorola 68030-based system, were also affected. The price of a Mac IIX with 4M bytes of RAM and a flexible disk drive was trimmed by 10% to $6,969 from an earlier price of $7,769. A Mac IIX with 4M bytes of RAM and an 80M-byte hard drive was slashed to $7,869 from $9,369, a 16% drop. Apple also cut prices on certain memory expansion kits for Macintoshes and a Laserwriter printer by 17%. Hard disk drive upgrades for the Mac were reduced by between 18% and 24%. Users expressed pleasure at the price cuts. ``It couldn't happen at a better time,'' said Edwin Sund, senior systems engineer at Weyerhaeuser Information Systems' PC support group. ``We think there may be a downturn in the economy. We don't want to spend any more money than we have to.'' By Julie Pitta, CW staff <<<>>> Title : Open Link firms Novell, A Author : CW Staff Source : CW Comm FileName: macnov2 Date : Jan 30, 1989 Text: SUNNYVALE, Calif. _ Apple Computer, Inc. and Novell, Inc. 
last week solidified their relationship by introducing the jointly developed Open Link Interface specification for third-party developers. Introduction of the specification coincided with Macworld Expo in San Francisco and represents the second significant link between the two companies. Earlier this year, Novell introduced Netware for Apple's Macintosh, allowing Macs and IBM Personal Computers to coexist in a local-area network through the use of Novell communications software. Novell also introduced source-routing drivers for IBM Token-Ring networks jointly developed with Ungermann-Bass, Inc. The product is said to allow users on Token-Ring networks running Netware to communicate across bridges using IBM Token-Ring Network Bridge software. The drivers are scheduled to be available sometime this quarter for $75. Guidelines for development The Open Link Interface is targeted at developers of LAN protocols and adapters. Officials at both firms said it offers guidelines for developing the interface between LAN adapters and protocols on Microsoft Corp. MS-DOS and IBM OS/2 platforms. As a result, developers using the Open Link Interface can design products that interoperate, they added. Nina Burns, vice-president of Infonetics, Inc., a Santa Clara, Calif., market research firm, said Apple's involvement in Open Link indicates the firm's commitment to networking. ``It provides a really good platform for independent third-party card makers and LAN operating system vendors other than Novell,'' she said. The specification is available to LAN vendors. A new release of Netware scheduled for later this year will include the implementation of the Open Link specification. A developer's kit for protocol vendors that will include Netware is expected in the second quarter for $3,000. A developer's kit for LAN adapter vendors including Netware is expected in the second quarter for $7,500. 
By Julie Pitta, CW staff <<<>>> Title : News shorts Author : CW Staff Source : CW Comm FileName: short123 Date : Jan 30, 1989 Text: Norris taps Soviet know-how A Soviet computer science course, which has been under development by Soviet scientists for five years, is getting a careful look-see by U.S. computer scientists _ but no military intelligence decoders will be needed for this effort. The William C. Norris Institute here, named after the Control Data Corp. pioneer, announced last week that it has concluded negotiations with the Soviet Academy of Sciences and Zodiak Computer Centre of Moscow to establish a joint U.S./USSR venture to develop and market computer technology-based software and courseware in both the East and West. The first product will be based on a Soviet course that offers a unique approach to teaching computer science, according to William C. Norris, chairman of the institute. Banks launch EDI service First Bank System (FBS) has entered into a joint venture with Westinghouse Electric Corp., Harbinger Computer Services, Citizens and Southern Bank in Atlanta and Marine Midland Bank in Buffalo, N.Y., to launch an electronic data interchange (EDI) service for the banking community. Headquartered in Atlanta, the new company, HarbingerEDI Service, is offering Intouch EDI, which consists of personal computer software and a network service compatible with the ANSI X.12 formats. It reportedly will link to other EDI systems. In particular, HarbingerEDI hopes to attract smaller firms that typically do not use EDI for anything other than payment services. FBS and the other two bank partners are the only U.S. banks that have ventured into full EDI marketing, according to Terry Sandvik, a senior vice-president at FBS Cash Management Corp. Only a small percentage of U.S. financial institutions can receive corporate trade payments and properly process remittance information, he added. Sprint enters price war U.S. Sprint Communications Co. 
slashed its Clearline T1 service prices last week, offering customers potential savings of up to 66% over AT&T, the carrier said. Base prices for the service will fall as much as 25%, and volume prices will come down by as much as 48% as of April 1, Sprint said. The cuts were made in response to AT&T's recent spate of price reductions for its Accunet services as well as increasing competition from independent fiber-optic-based carriers, a company spokesman said. MCI Communications Corp. has yet to respond to AT&T's cuts with a similar tariff. Microsoft powers up LANs Microsoft Corp. has announced a minor upgrade for its OS/2 LAN Manager that reportedly will enable a LAN Manager-based network to support a virtually unlimited number of users and applications running concurrently. The upgrade is slated to ship to OEMs in March and will be offered to LAN Manager sites at no charge. According to company officials, the upgrade is said to increase the number of ``file handles'' in LAN Manager from the current 255 up to 64,000 and has no impact on other activities such as copying files to a workstation. File handles are a mechanism used to enable programs to access files and vary in number according to the application in use. This, in turn, regulates the actual number of users who can concurrently access a single package. Wang into disaster recovery Wang Laboratories, Inc. announced a program that provides equipment, services and personnel to restart computer operations following damage to Wang equipment. The fee for the Disaster Recovery Services program, an option to Wang's hardware maintenance service contract, is said to be 1% of the total cost of the customer's Wang equipment. The cost includes equipment repairs and replacement, shipping, installation and support time, the company reported. It will also cover the price of using an alternative processing site and expenses that exceed a customer's normal processing costs. 
Wang said on-site response to a customer's call would take place within an average of four hours. ET <<<>>> Title : X.400 users get E-mail br Author : CW Staff Source : CW Comm FileName: email2 Date : Jan 30, 1989 Text: Two electronic mail vendors, AT&T and Dialcom, Inc., announced the first commercial interconnection between E-mail services using the CCITT X.400 protocol in the U.S., which will enable users of the different E-mail services to exchange messages. The interconnection will be commercially available during the first quarter of this year, the vendors said. AT&T spokesman Jim McGann said there will be no additional charge for AT&T Mail users who send messages to Dialcom's E-mail service. Pricing has not been determined for messages sent from Dialcom to AT&T Mail, said Karen Chun, director of marketing services at Dialcom in Rockville, Md. AT&T's decision reflects a trend among electronic messaging services suppliers to not double-charge subscribers who are sending messages to competing E-mail services. It remains to be seen how much user demand there is for E-mail interconnection, Chun said, but she noted that the Aerospace Industry Association's initiative to create a multivendor E-mail network for the industry is an example of a business application for interconnected E-mail services [CW, Jan. 16]. Both AT&T and Dialcom are involved in that project. X.400 connections between different E-mail services have been technically feasible since late 1987, but it has taken until now for vendors to hammer out agreements on revenue distribution and other difficult administrative matters, according to Michael F. Cavanagh, executive director of the Electronic Mail Association in Washington, D.C. Too many cooks Another issue stalling X.400 interconnection is that mail providers have been taking different approaches to building X.400 gateways. 
Even though the links technically conform to the standard, they cannot talk to each other, users have charged. AT&T and Dialcom _ apparently the first to develop a revenue distribution deal _ have been leaders in the industry's International Administrative Management Domains Operations Group, which is working to develop agreements on financial accounting between interconnected E-mail services. In addition, AT&T and Dialcom announced an agreement in which users of FTS-Mail, the E-mail service AT&T will provide to the federal government under the Federal Telecommunications System 2000 contract [CW, Dec. 12], can get access to Dialcom's news and database services. For example, FTS-Mail users will be able to access Dialcom's Procurenet, which allows government agencies to send bid solicitations electronically to a typesetter for publication in the government bulletin ``Commerce Business Daily.'' The price of access to Dialcom's database services reportedly is still under negotiation, the vendors said. Dialcom, which already provides information services and E-mail for 62 federal agencies, will also provide X.400 connections between its current federal customers and FTS-Mail users, a spokeswoman said. By Mitch Betts, CW staff <<<>>> Title : HP opens LAN doors to PC Author : CW Staff Source : CW Comm FileName: hpnews2 Date : Jan 30, 1989 Text: Hewlett-Packard Co. last week outlined a two-phased approach to OS/2 connectivity that encompasses support for DOS, OS/2 and Unix. This support extends to a mix of networks, including Ethernet, token-ring and broadband systems. HP LAN Manager, a version of Microsoft Corp. OS/2 LAN Manager, will enable DOS and OS/2 workstation users to access OS/2 file servers [CW, Jan. 16]. In the second phase, HP LAN Manager will be integrated with HP LAN Manager/X Operating System 3 (LMX), a Unix version of LAN Manager co-developed with Microsoft. 
This reportedly will enable DOS and OS/2 users to go an extra step and access Unix-based servers such as the HP 9000 minicomputer. ``OS/2 is important to us,'' said Herschel Kenny, a system supervisor at Allied Signals, Inc. in Morristown, N.J., and an HP user. ``Anything that will help us connect our PCs to our minis will be of assistance.'' Together, the two software packages provide users with access to increasingly robust server services. HP is also encouraging developers to build integrated OS/2- and Unix-based applications. Also, Transmission Control Protocol/Internet Protocol support coupled with Arpanet services support will enable PC users to avoid gateways when accessing applications and resources on compatible Unix-based office, engineering and manufacturing computers. Both packages will be supported under HP's Openview. ``HP's version of LAN Manager with hooks into Openview will allow users to centralize LAN management,'' said David Passmore, an analyst at Ernst & Whinney in Fairfax, Va. Scheduled to be available in the second half of the year, HP LAN Manager software will be priced under $3,000. By Patricia Keefe, CW staff <<<>>> Title : Opening more Windows Author : CW Staff Source : CW Comm FileName: micro Date : Jan 30, 1989 Text: REDMOND, Wash. _ So far, Microsoft Corp. has only lethargically supported its Windows interface with applications. But the company will spring into action with a rash of product releases beginning early this year and spanning the next two years. Although Windows was announced in November 1983, after more than five years Microsoft still has only one major Windows application available _ its Excel spreadsheet. This will change sometime in the first half of this year when the firm that has so ardently preached the Windows way will finally announce the $495 Windows Word. A word processor, Windows Word is in the late stages of beta testing. 
Following on its heels will be Omega, a graphical database management system currently in beta testing that will be out well before year's end, Microsoft said. Eventually, most key Microsoft applications will be moved to both Windows and the OS/2 Presentation Manager, Microsoft officials said. Windows, however, has proven to be a difficult and complex environment to develop for, even for Microsoft. As a result, some of these applications may be a long time coming. ``Microsoft is like a Japanese company. They have the longest view of things of anybody. They are always thinking five years out,'' said one user at a Microsoft beta-test site. Despite its sluggish move to Windows, the firm plans a quicker ramp-up for the equally graphical OS/2 Presentation Manager, with a mid-year release of Excel/PM. This product will essentially be a port of today's Excel for Windows with a $50 upgrade charge, said Pete Higgins, general manager of Microsoft's analysis business unit. Also on the horizon are Windows and Presentation Manager versions of all key Microsoft applications. The firm will develop an entirely new version of Microsoft Project for graphics environments and will port versions of Microsoft Works and Powerpoint, a Macintosh presentation package, to Windows and the Presentation Manager, said Mike Maples, vice-president of applications at Microsoft. Windows Word duplicates all the features of the $495 character-based version of Word. The product includes a runtime version of Windows/286 and features a thesaurus, spell-checking, autosave, on-line Help, advanced formatting and the ability to customize the system for individual tastes and work styles, a beta tester said. This product will also provide the setting for the debut of Microsoft's embedded macro language based on Quickbasic. This language, first discussed in October 1987, will eventually work across all graphical Microsoft applications and will also serve as the development language for Omega. 
As far back as October 1987, Microsoft dropped hints that Basic would be positioned as a database development language. Omega, which some beta testers expect to be ready in the third quarter, will use this language. ``It is like Dbase Windows,'' a beta tester said, referring to the fact that a programming language is central to the product. Ashton-Tate Corp. has the Dbase language; Microsoft has Basic. Omega will also serve as a front end for SQL Server, a database engine developed by Ashton-Tate, Microsoft and Sybase, Inc. Like Quickbasic, this macro/database development language generates pseudocode, or p-code, which essentially allows users to pseudocompile programs as they edit. The system creates code that is ``90% along the way to real native machine code,'' Microsoft explained. Basically enhanced Basic, however, has been enhanced for the graphical environments. ``There are keywords and parameters that make it particularly good for Windows, dealing with things like fonts,'' the beta tester said. There is more to Omega than just Basic. The product also contains so-called nonprocedural tools for reporting, querying and working with forms. This positions the product against Lotus Development Corp.'s Lotus/DBMS, which is slated to provide a similar array of graphical database tools. The key difference, at least on the surface, is that Lotus will initially target the OS/2 Presentation Manager, while Microsoft will first squeeze its tools into the confines of Windows. With the inclusion of Basic as a database development language, Omega will also be aimed squarely at Dbase, the DBMS from Microsoft SQL Server partner and rival Ashton-Tate. Although a handful of programmers are already using Omega for applications development, the product is ``still very flaky _ prealpha really,'' said one East Coast beta tester. Who should use Windows Word? ``If you have a 6 MHz AT or less, I would recommend PC Word. 
If you have a 10-MHz 286 or more, I would recommend Windows Word,'' said Jeff Raikes, general manager of Microsoft's office business unit. Raikes said Word for the Presentation Manager should ship about three to nine months after Windows Word. By Douglas Barney, CW staff <<<>>> Title : M&D PIOS users get sold o Author : CW Staff Source : CW Comm FileName: aamd2 Date : Jan 30, 1989 Text: DALLAS _ In a move that left users of the PIOS manufacturing resource planning system puzzled about their future, McCormack & Dodge Corp. handed PIOS over to Arthur Andersen Consulting last week. With an installed base of 75 sites, PIOS is used by a number of large defense contractors. The transaction is part of an agreement between M&D and Andersen Consulting under which the two firms will jointly sell M&D's Millennium financial and human resources software and Andersen Consulting's Mac-Pac family of manufacturing software. Mac-Pac has roughly 600 installations, according to Andersen Consulting. As part of the agreement, the two firms are integrating Mac-Pac with M&D's Millennium software. Although Andersen Consulting promised three years of maintenance for PIOS _ which stands for Production and Inventory Optimization System _ the firm also said enhancements will have to be paid for by the user base. An Andersen Consulting official said the firm would not sell PIOS to new accounts. M&D employees who had worked on PIOS development and marketing will be offered positions with Andersen Consulting under the pact. Andersen Consulting said it has no plans to acquire M&D, now part of The Dun & Bradstreet Corp. PIOS never caught on M&D acquired the rights to the PIOS package four years ago from Rath & Strong, Inc., a Lexington, Mass.-based consulting firm, in an effort to diversify its product line. Although M&D sought to sell the package to the commercial market, it never caught on there. ``I'm disappointed. 
I think it's a good product that's going to go down the tubes for all the wrong reasons,'' said PIOS user Bob Herzog, director of Information Systems Services at Combustion Engineering in Windsor, Conn. Herzog said his firm has no intention of moving to Mac-Pac. If Andersen Consulting withdraws support for the package later on, users should continue with development on their own, he said. Malcolm McNeil, director of information services at Santa Barbara Research, a division of Hughes Aircraft Co. in Goleta, Calif., said he cannot change from PIOS soon. ``There are a lot of users who have been trained, and a lot of money has been spent,'' he said. McNeil's firm has been migrating to PIOS for the past several years. PIOS now runs with IBM's IMS and Cullinet Software, Inc.'s IDMS databases but not with IBM's DB2. Paul Bellwood, at Northrop Corp. in Los Angeles, said he is interested in a DB2 version, which he said he understands is under development. However, an Andersen Consulting official said there are no plans to offer a DB2 version of PIOS. Several users said that M&D did not comprehend the selling cycle for manufacturing software, which is typically several years, in contrast to much briefer periods for financial software. Rumors of the sale of PIOS to Andersen Consulting surfaced at the PIOS users group meeting in December. The group meets again Jan. 30 in Dallas. Andersen Consulting has been invited to address the group, said Bellwood, who is chairman of the users group. Although Andersen Consulting reportedly paid M&D for PIOS and its associated employees, the amount was not disclosed. M&D bought PIOS because many of its customers were manufacturing companies and it wanted to sell them a full range of products. PIOS, however, did not lend itself to being integrated with M&D financials, Bellwood said. M&D had recently lost money from its PIOS group, one source indicated. 
By Stanley Gibson, CW staff <<<>>> Title : A case of shifting priori Author : CW Staff Source : CW Comm FileName: dcbox2 Date : Jan 30, 1989 Text: In 1978, most of the staff allocations were applied to applications development, with computer operations taking second place and data entry third. Only a negligible 3% of the staff was allocated to support end-user computing that year. <<<>>> Title : Unisys Micro A desktop ti Author : CW Staff Source : CW Comm FileName: usis Date : Jan 30, 1989 Text: NEW YORK _ Unisys Corp. last week announced the Micro A, a desktop extension of its A series mainframe architecture that the company hopes will capture as much as 40% of its sales from new accounts and drive prospective Application System/400 users away from IBM. However, observers said that the system could till new soil for Unisys that could later nurture sales of larger processors but that the application-deficient system will cause little problem for the AS/400. With the Micro A, available today, the company has condensed the A series mainframe on a single chip, called the Single Chip A-Series Mainframe Processor, or Scamp. The chip, along with 12M bytes of memory and I/O logic, has been placed on an expansion board that slides into a specially equipped Unisys PW2 Series 800 personal workstation. The Micro A runs the A series MCP/AS operating system, which adds $5,000 to the hardware cost of $20,365. In addition, the Micro A contains an Intel Corp. 80386 processor and the Microsoft Corp. OS/2 operating system, which serve as the I/O processor and maintenance subsystem. It also contains a board for a small computer systems interface (SCSI) host adapter. According to Jeffry Beeler, an analyst at Dataquest, Inc. in San Jose, Calif., both OS/2 and the SCSI interface are critical for integration into accounts _ for example, in the banking industry _ that necessitate interoperability with platforms such as the IBM Personal System/2. 
Although the company referred to the Micro A as a desktop mainframe, Neil Waddington, vice-president of corporate marketing and services, said that ``the Micro A will not be used on the desk top because it can support up to 16 users,'' or three programmers developing applications with Linc, the company's fourth-generation language (4GL). According to Waddington, approximately 65% of the systems will be sold with Linc, which will bring the total system cost to approximately $50,000. Mapper, the company's other 4GL, will reportedly be available in the second half of this year. Eric Thomas, director of information systems and services at Lincoln Hospital in Phoenix and an A10 Model F user, called the price of the Micro A with Linc phenomenal because it will allow Linc programmers ``to develop their little hearts out without impacting mainframe users.'' But Thomas said that he will not buy a Micro A until applications software for nursing stations becomes available. The paucity of applications, however, could militate against Unisys' attempt to position the Micro A as a weapon against IBM's formidable AS/400, said Peter Burris, an analyst at International Data Corp. in Framingham, Mass. The AS/400's applications number in the thousands. By Robert Moran, CW staff <<<>>> Title : By 1988, user-related act Author : CW Staff Source : CW Comm FileName: dcbox3 Date : Jan 30, 1989 Text: By 1988, user-related activities had risen substantially on the staffing allocation agenda, while the percentage of the computer operations staffing was only half of the 1978 allocation level and data entry staffing was down to only 1%. <<<>>> Title : The company is forecastin Author : CW Staff Source : CW Comm FileName: dcbox4 Date : Jan 30, 1989 Text: The company is forecasting even greater changes for 1998, ones that will further deemphasize staffing for traditional data center functions for the following reasons: Intel Corp. 80486/586 chip technology will dominate the market. 
Computer-aided software engineering will mature. Commercial software offerings will continue to improve. Automated operations will be tightly integrated with the mainframe operating system. Bar coding and optical scanning will eliminate data entry. Optical-disk technology will be used for storage. Automated network linkage will be in place. <<<>>> Title : Paying the piper Author : CW Staff Source : CW Comm FileName: j23edit Date : Jan 30, 1989 Text: WHAT ONE HAND gives, another can easily take away. In the wake of the U.S. Supreme Court's affirmation of states' rights to tax interstate communications traffic, businesses stand to lose a substantial amount of the interstate rate savings that divestiture of the Bell System has produced. So, what can you do about it? Swallow hard and get ready to pay up. About a dozen states either have or are considering tax measures to raise revenue from communications links. With reports popping up continually about another state facing a budget deficit, this is going to be an easy one for governors and state legislators to enact. Relocating headquarters and branch offices is not a realistic alternative for most businesses. The growing trend among the 50 states to find ``revenue enhancement'' alternatives to income-tax hikes makes any relocation plan a risky gamble that could prove fruitless next week or next year when the taxman surfaces in the new locale. For those businesses that are where they are because they have to be there, it's time to start thinking about tax-rebate strategies. If two states tax the same communication, one or both is going to have to come up with a whole or partial rebate, to eliminate double taxation. That's going to require paperwork on your part, but more important, it's going to call for up-to-date communications hardware and software capable of providing businesses with information to the most minute detail for both voice and data traffic. 
Finally, there is as much reason for applying information systems tools to voice traffic as there already is for data traffic; we may not like it, but the tax needs of the 50 states may finally bring the justification for full integration of the voice and data communications organizations of today's businesses. <<<>>> Title : Going down? Author : CW Staff Source : CW Comm FileName: jj23edit Date : Jan 30, 1989 Text: Pity the poor microcomputer vendors. When memory chip prices went through the roof last year, they reluctantly raised system prices to compensate. ``We have no choice,'' they said. ``These prices are just eating us alive.'' Not anymore. Chip prices are headed down _ way down, if some predictions are to be believed. Already, volume prices are less than half of what they were seven months ago. But have PC prices come down yet? Well . . . Few will argue with the rights of PC makers to earn a profit. But when vendors cloak their actions in the guise of market forces, they owe their customers an explanation when those forces change. At this point, all they have said about reducing prices is that they'll think about it. If this explanation strikes you as inadequate, ask for a better one. If one is not forthcoming, consider letting your wallet do the talking. <<<>>> Title : Cloning the Macintosh: A Author : G. McKendre Hayn Source : CW Comm FileName: haynesle Date : Jan 30, 1989 Text: I recently read Douglas Barney's editorial on the lack of a Macintosh clone [CW, Dec. 19] and thought you should know that there is one available today. A low-cost Mac clone requires an Atari 1040ST, Mega 2 or Mega 4 computer with a monochrome monitor. Also required is a ROM cartridge and software called Spectre 128 (list price $179.95) from Gadgets by Small and 128K-byte Mac ROMs (available for about $150). A hard disk is optional. Also, a hardware/software package enables the Atari to read the Macintosh floppy disks. 
This setup will not run the color software written for the Macintosh II, but it will run all software that runs on the Macs with 128K-byte ROMs. And since the Atari computers use the Motorola 68000 CPU running at 8 MHz, the Mac software will run just as fast. Additionally, the Atari monitor is larger and has a slightly higher resolution than the Mac monitor. Based on an Atari 1040ST (at least 1M byte of RAM is required), one should be able to assemble a Macintosh-compatible system at a cost below $2,000. G. McKendre Haynes Oconee Nuclear Station Computer Services Seneca, S.C. <<<>>> Title : Cloning the Macintosh: A Author : Ron Howe Source : CW Comm FileName: howelet Date : Jan 30, 1989 Text: I found Douglas Barney's editorial on cloning the Mac to be amusing, since it tells only part of the story. While it is true that a Mac clone would be a welcome addition, it is not true that buying a PC clone will save you money. First of all, the Mac is not slow. True, the program startup is slower than a PC because of the things the program must do at the beginning; however, just about everything else is faster. It is costing you tens of thousands of dollars a year more in lost time to use a PC instead of a Mac. Think about that the next time you are congratulating yourself for sticking with a PC. And while you PCers are trying to get Presentation Manager to work (to bring yourselves up to where Apple was in 1985), rumor has it that Apple is going to announce new computers in 1989. This will mean that for a few thousand dollars more, you can drive your Mac Corvette or your PC Hyundai. The choice is yours, but which users do you think will move ahead in the company? Ron Howe Sr. Database Designer Computer Task Group Raleigh, N.C. <<<>>> Title : Cloning the Macintosh: A Author : Kevin M. 
Rahe Source : CW Comm FileName: rahelet Date : Jan 30, 1989 Text: In response to Douglas Barney's article, I suggest he take a look at the Amiga 2000 for a system with the qualities of the Apple Macintosh and the value of an IBM PC AT clone. While it has a mouse-driven windowing user interface similar to the Mac, the Amiga comes with a full-function keyboard for those who would rather not depend on a rodent to get their work done. And since every Amiga is equipped with graphics coprocessors to off-load the screen-drawing function from the CPU, the Amiga is not as sluggish as the low-end Macs. The Amiga not only offers this value, it also offers the software and hardware compatibility of an AT clone, so you can keep your PC software when migrating to more versatile and powerful Amiga applications. Finally, Amiga offers multitasking with hundreds of application programs that work well under its environment. Cranking up your word processor while working on a spreadsheet is as simple as point and shoot. Kevin M. Rahe Comstock Park, Mich. <<<>>> Title : Has anybody seen a few de Author : Michael Cohn Source : CW Comm FileName: cohn7a Date : Jan 30, 1989 Text: The first thing I asked myself Monday morning as I went in search of my new office was why did I wear my best suit. Just a week before, I had been stunned when the memo came across my desk. I was expecting to see another ``We've postponed the recently postponed relocation of the Computer Center to the new building. We'll keep you posted as to when we postpone it again'' message. But instead, the three-sentence memo took us all by surprise: ``The Computer Center move will occur this weekend. All programming staff should have their terminals, furniture and moving boxes labeled and secured by 5 p.m. Friday. Please contact the operations manager for additional information.'' I don't handle Monday mornings all that well, anyway. 
But when I came in and found the assortment of boxes, terminals and cables creatively sprawled on the floor of my new office, I immediately wondered whether it would be worth it to go home and change clothes or maybe just go home. A few doors down, I found the office of my programmers. Ron was already acclimating to his surroundings. He was preoccupied with breaking down his moving boxes and trying to stack them flat in the corner. Regrettably, he had not completely emptied them first. Ron seemed a bit upset. ``I knew this would happen,'' he moaned. ``They lost two of my boxes. And look, no furniture _ no desks, no chairs, no credenzas. Where am I supposed to put my stuff? And who knows when they'll get around to getting the system up.'' I noticed that Ron's terminal was turned on, sitting right in the middle of the floor. The image of Ron lying on his stomach and banging on his keyboard didn't surprise me. I figured I'd better get back to my office and call operations before things got out of hand. By the time I unpacked my phone, discovered a working phone jack and got through to the operations manager, it was nearly 11:30 a.m. ``What's going on here?'' I asked. ``It's almost lunchtime. We have no desks. And the system's still down. I've got to run a cycle tonight. What's the story?'' ``Look, I'm doing everything I can, OK?'' he answered. ``I've got people complaining because I lost some of the desks. Then they're complaining because I lost some of the production tapes. If everyone would be a little patient, I could straighten the whole thing out.'' ``You lost production tapes?'' Things were getting worse by the minute. ``How much did you lose?'' ``Heck, probably less than 1%. Just the first 18 or 20 feet from every reel. But I'll take care of it. It's not like we didn't back everything up to cartridges. Locked them all up in a couple of desk drawers.'' ``Are those the desks you've lost or the desks you've found?'' I asked, only half joking. 
``Well, it really doesn't make a difference, because no one knows where the keys are.'' Sitting on the floor At 4 p.m., there was still no news from the furniture front, and everyone had resorted to making little piles of books and papers on the floor. The system had come up long enough for us to slip in a few reporting jobs. I called the operations manager again to check that the output was printing somewhere. ``The printers? Don't worry, they're running fine. We left them in the old building for now.'' ``Great,'' I said, ``but how do I get the printout?'' ``I already thought of that. We send a truck out there three times a day. In fact, the third one left just 20 minutes ago to look for the first two trucks.'' I was pretty fed up. ``Look, I don't mean to complain. I know you have a tough job. But didn't you have this thing scheduled for weeks? Didn't we have walk-throughs? Recovery tests? File backups? Would it be too much to ask to get a desk or two over here as soon as you find all your tapes, printers and disk packs?'' ``Disk packs? No one said anything about moving any disk packs.'' I heard a loud crash in the background. The phone suddenly went dead. It looked like it was going to be a long night. By Michael Cohn; Cohn is a quality assurance representative based in Atlanta. <<<>>> Title : How to succeed as a manage Author : Dennis Noonan Source : CW Comm FileName: noonan1 Date : Jan 30, 1989 Text: It was scary when I first got promoted to management. I had read The Peter Principle and was fearful that I would be incompetent as a manager. When I admitted my doubts to a neighbor, Joe, he told me not to worry. ``Trust me,'' he said. ``You can't be any worse than most of the so-called managers out there. You'll be fine. Just don't let them see you sweat.'' Joe was a management consultant, so he always spoke with authority. It was easy to believe him because he always told you what you wanted to hear. Clearly, my strategy of ``creative incompetence'' had failed. 
Despite my squishy shoes, tweedy sport coat and randomly matching socks, here I was, a project manager with seven project members to lead down the path toward excellence. Somehow the system had failed. I should have been passed over. Joe said that my strategy probably failed to take into account how the others were dressed. Incompetency principle The Peter Principle was a crucial book for me. The author pointed out that competent people keep getting promoted until they reach a job that exceeds their ability. In a bureaucracy, incompetence is viewed as a barrier to further promotion, not grounds for termination. Thus, incompetent managers are doomed to remain in place, making themselves and everyone else miserable. I had seen it happen myself when one of the programmers or systems analysts got promoted. In the beginning, there were well-wishes from former fellow workers. The promotee vowed to help make a better world. For a few months, the new manager continued to associate with former peers. Advice and feedback were solicited. And the work world seemed to be a better place. This feeling lasted until the first big crisis. Leadership engenders piles of paperwork. The technical skills that got the executive promoted do not help in coping with the administrative demands of bosshood. The new manager quickly realizes he has not been prepared for this role. He sees that there are two distinct worlds: theoretical and real. All the educational information is theoretical. All the problems are real. Time, the scarcest resource of management, becomes too precious to be squandered on nonpriority activities. Inevitably, the boss' attitude shifts from affiliative to aloof. Lunches with the old gang and visits to the local watering hole dwindle. When the new boss does attend these gatherings, the conversation is dominated by concerns with status, issues, plans. A silent wall forms between the executive and former co-workers. He thinks they take advantage of his friendliness. 
They think he is changed, distant, carried away with his own importance. Eventually, the term ``incompetent'' is tossed around. It is an unpleasant turn of events _ the fate I sought to avoid by evading promotion. Faking it I was successful for several years. The Peter Principle offered advice to those who felt they had achieved the last level of competence. The prescription suggested that the person who did not want to be promoted should pretend to be unfit for the next level. You do things that will make your managers question your competence, while your peers and associates still think you are effective. It seemed like a foolproof strategy to me. But now I had to face up to the fact that my strategy had failed and I would have to make the best of it. My new boss, the systems manager, had been recently promoted himself. He was the direct type _ direct to a fault. He started out telling me that I was not his first choice for the job but my seniority had been an unavoidable factor in his decision. I was sure that this was just a pep talk to keep me on my toes in my new position of authority. The systems manager told me that I had inherited some ``problem people.'' My first problem was to try to figure out which ones they were. They all seemed to understand what they were supposed to do, and they kept turning in great work. Oh sure, there were a couple of weirdos on that team, but I figured, heck, this is systems, not sales. My incompetence as a manager really came to the surface when I discovered that each of these people knew more about programming than I did. In fact, the only contribution I could make was to hammer away at the project milestones and restate the goals and deliverables in simple terms. All I had to do was point them in the right direction. Soon I had gotten into the habit of leaving them alone unless someone wanted to chat or show me what he had been up to lately. These folks were like kids, so proud of themselves. 
Helpful feedback They were getting results, too. The users were always calling and telling me what a great job we were doing. I passed these remarks on to the systems manager, but he ``wasn't interested in the opinions of nontechnical people.'' He was much more concerned with the fact that one of my status memos had three typos, and how did that make us look? My management consultant friend had been dead wrong. I was not doing fine. As a manager, I was incompetent. Sure, good things were happening, but I wasn't making them happen. It was them, not me. When the vice-president complimented me in writing, my systems manager hastened to point out that the team had done all the excellent work, and I should not acquire a big head. On my next review, the successes of my team were attributed to having an exceptionally competent staff. My accomplishments as a manager were deemed ``adequate.'' Most of the review was devoted to a discussion of the systems manager's views on standard methodologies and the leadership role of MIS in the strategic planning of the company. It was clear that I had failed to measure up to his standards of excellence. He thought I should become more technical so that my programmers wouldn't be able to fool me, and that I should become more demanding of my team members and exert more influence over my peers. He also noted that I had to be prodded to turn in monthly status reports on time. Moreover, I had shown what he called disdain for traditional systems development methodologies. Although the systems seemed to work and the users were happy, many of my team members were remiss in ensuring that the documentation group was provided with the latest commented code. My systems manager was not amused by the sign on one programmer's door that read, ``Documentation is for sissies.'' A pushover In time, I became convinced that I had failed to grasp the essence of managing people in a systems development environment. I was too trusting, too easygoing. 
And although the results seemed valuable to me, my manager was clearly unimpressed. Eventually, our relationship deteriorated to the point at which I had to look for another job. I got a lucky offer from a big high-tech company as a project manager. I decided not to repeat the mistakes of the past. This time, my neighbor Joe advised me on my wardrobe. I got a couple of dark suits, a power tie and expensive shoes. He made me buy all-white shirts. I became the first project manager to turn in a monthly status report. Sure, I inherited a few people problems, but I straightened them out fast. The programmers took some technical courses, and there was a lot of turnover because of my high standards. My boss thought I was doing fine, but the users were unreasonable. They just didn't seem to realize the importance of good, readable documentation and flowcharts. They kept calling to ask me why their systems weren't done yet. But I figured, heck, you can't satisfy everybody. By Dennis Noonan; Noonan is a free-lance writer based in Wellesley, Mass. He was formerly a project manager at a minicomputer maker in the Boston area. <<<>>> Title : Books in brief Author : CW Staff Source : CW Comm FileName: books Date : Jan 30, 1989 Text: BOOKS IN BRIEF Disaster Recovery Planning By Jon William Toigo Managing risk _ or catastrophe, in which risk turns into disaster _ in information systems, complete with a pullout flowchart showing emergency decision-making at each stage of a disaster recovery. Hardcover, 267 pages, $45, ISBN 0-13-214941-9, by Prentice Hall, Englewood Cliffs, N.J. DB2: Maximizing Performance of Online Production Systems By W.H. Inmon An experienced writer on databases tells how to create DB2 structures, design and load DB2 tables, run applications and maximize performance. Paperback, 369 pages, $39.95, ISBN 0-89435-256-3, by QED Information Sciences, Inc., Wellesley, Mass. IBM's Local-Area Networks By W. 
David Schwaderer A senior programmer in IBM's Storage Systems Strategy and Architecture Development offers a straightforward guide to IBM Personal Computer LAN implementation. Paperback, 294 pages, $29.95, ISBN 0-442-20713-1, by Van Nostrand Reinhold, New York. Network World's Teletoons The editors of Network World present the comic side of communications in the teletoons drawn by Phil Frank and Joe Troise, the weekly newspaper's regular cartoonists. Paperback, 52 pages, $9.95, by Network World, Framingham, Mass. The Computer Virus Crisis By Philip Fites, Peter Johnson and Martin Kratz The prevention and cure of computer viruses _ as timely a subject as there is currently in the computer industry _ is covered in this very readable short paperback. Paperback, 171 pages, $19.95, ISBN 0-442-28532-9, by Van Nostrand Reinhold, New York. Publishers wishing to have their books considered for review or excerpting can direct books, prepublication galleys, press releases, catalogs or other information to George Harrar, Features Editor, Computerworld, P.O. Box 9171, 375 Cochituate Road, Framingham, Mass. 01701. <<<>>> Title : Project Da Vinci: A medic Author : CW Staff Source : CW Comm FileName: uic55 Date : Jan 30, 1989 Text: CHICAGO _ Leonardo da Vinci, the artist and inventor of the Italian Renaissance, was thought by some contemporaries to be diabolically morbid. He spent a good deal of time not drawing from life but instead sketching cadavers. His painstaking research of the dead, however, has brought tangible benefits to the living _ Leonardo's work formed the basis for modern medical illustration. And recently, Da Vinci the genius has inspired Project Da Vinci, a $2 million program at the University of Illinois at Chicago (UIC) that is building a three-dimensional database containing vast amounts of information on the human body. The data, which is stored on a Control Data Corp. 
Cyber 930 mainframe, will contribute to two goals: the synthesis of a ``Standard Man,'' or composite of many different body types, and the effort to identify missing children by predicting how their bodies and facial features age. ``There's data specific to age, race and sex, so that we can summon up a picture of a human being across the range of the lifespan,'' said Lewis Sadler, a medical illustrator who heads the university's Department of Biocommunication Arts. The data accumulating in the Cyber 930 is based on measurements of dozens of cadavers donated to medical research. As many as 100 cadavers may eventually be used to create the database. The work parallels similar efforts under way at the University of Colorado and the University of Washington. According to UIC officials, the project is funded by such vendors as CDC, AT&T, Du Pont Co. and Procter and Gamble Co. Although it is not expected to be complete for several years, the Standard Man database will allow users to summon up descriptions of human beings for medical and commercial applications. Sadler said he anticipates practical applications in the areas of sports and rehabilitative medicine, clothing and shoe design, medical instruction and the engineering of military instruments. The centuries-old art of medical illustration has not yielded to computerization until recently, according to Sadler. ``For hundreds of years, people have talked about humans as belonging to a single species,'' he explained. ``But researchers ended up specializing in smaller and smaller areas of study, and nobody was looking at the whole picture.'' The old-fashioned way Until now, medical artists such as Sadler and his 12 colleagues on the UIC faculty were forced to render all illustrations by hand. For example, university illustrators have aided police in ``aging'' pictures of missing children in the hope of recovering them from noncustodial parents. 
The aging technique, which relies on the fact that facial dimensions change fairly predictably over time, has helped recover 24 of 81 children during the last three years. However, it is extremely painstaking work. ``It used to take 20 hours to `age' one child,'' Sadler reported. ``Now, we can do it in minutes.'' Hand-drawn work also suffers from some inaccuracy because of human error. ``We felt we were very accurate if we were 87% accurate,'' Sadler said. Not only does the use of computers increase accuracy, but data gleaned from cadavers is also far more precise than human dimensions derived from CAT scans or nuclear magnetic resonance (NMR) techniques, university researchers say. CAT and NMR scans tend to highlight bony structures and cartilage but do little to show soft-tissue structures. The examination Rather than scanning a body electronically, UIC researchers obtain a physical slice of a cadaver measuring one millimeter thick. This is accomplished by first displacing the water in the body with plastic, which gives the tissue sufficient rigidity to be sliced. The slice is then X-rayed, and the resulting image is stored digitally. Once entered into the Cyber 930 system, the Standard Man data can be accessed by a Cyber 910 graphics workstation, several AT&T microcomputers and an AT&T 3B4000 minicomputer. All of these computers run AT&T's Unix System V and can exchange their multimegabyte files on a common network spanning two adjoining labs. A single 3-D image of a human body requires 2.2G bytes of memory. Because many computers are obtained through vendor grants, UIC's computing philosophy is one of flexibility. ``We want to maintain a multivendor environment,'' said Thomas Prudhomme, a UIC official who assists the university's chancellor in fund-raising ventures. 
``We want to provide any faculty member with the computer platform of his choice.'' Sometimes platform choices are limited by which vendors choose to donate _ or discount _ their computer systems for use at the university. In the case of Project Da Vinci, AT&T's donation of more than $1 million in hardware and software products dictated the use of Unix System V. But funds from other grants were used to procure additional computers, Prudhomme said. By Jean S. Bozman, CW staff <<<>>> Title : DEC's desktop helps autom Author : CW Staff Source : CW Comm FileName: decrad Date : Jan 30, 1989 Text: Last week's torrent of desktop hardware and software unleashed by Digital Equipment Corp. left many scratching their heads. But as users begin to make sense of the announcement, some may find the products and capabilities that were announced add new functionality to systems already in use. The field of radiology is one that will benefit from DEC's new products. DEC has long offered application software to automate radiology departments in hospitals, but last week's announcement supplies the final pieces that will allow text and medical images to be integrated for the first time, according to DEC officials. Specifically, the pieces are Decwindows software, the Vaximage Application Services software tools and the desktop workstations in conjunction with Decrad, DEC's radiology software. Decwindows will be included in Release 5.1 of VMS, due out next month. However, a handful of early support customers have already been at work building systems. A long way to go Dr. Gilbert Jost, Chief of Diagnostic Radiology at The Mallinckrodt Institute of Radiology in St. Louis, said that, using the new DEC products and Decrad, his department has been able to integrate text and image in test demonstrations. Currently, images are mainly stored on film in the radiology department. 
Although he views the ability to integrate images and text on the same platform as an important step forward, Jost acknowledges there is still a long way to go. The DEC products reportedly will be integrated with other DEC or non-DEC components to form a picture archiving and communications system that stores, displays and distributes text- and image-based patient information. For instance, a radiology image could be displayed in one window while a text-based record containing patient information could be displayed in another. An underlying part of this technology is software that converts images from a radiology-specific machine such as a CAT scanner to DEC Document Interchange Format (DDIF) protocols. By reducing text and image data to a common data type, the image-based data can be transmitted from the radiology machines to an Ethernet network of general-purpose computers. The conversion software is the result of a joint effort of DEC and Siemens Medical Systems, Inc., a subsidiary of Siemens AG. Although general-purpose hardware will not replace radiology-specific machines, the common data format and DEC's Compound Document Architecture allow the lower-cost workstations to do some of the work that used to require special-purpose hardware. The Vaxstation 3100, DEC's new entry-level workstation, and the Vaxstation 3520 and 3540 workstations will be the most cost-effective models for a radiology system, according to the company. The Decrad software _ which runs solely on VMS operating systems _ is priced from $21,000 to $100,000, depending on the hardware on which it runs. Although situations will vary, the company estimates that a sophisticated radiology department could be automated with a DEC system, including workstations and software, for about $500,000. By Amy Cortese, CW staff <<<>>> Title : Cincom plans Directions c Author : CW Staff Source : CW Comm FileName: 23soft Date : Jan 30, 1989 Text: Cincom Systems, Inc. 
will hold its next Directions executive conference from Feb. 21-24 at the Hotel Inter-Continental in Miami. The conference is titled, ``Directions in Manufacturing, Keys to World Class Performance in 1990.'' For additional information, contact Cincom at 800-543-3010. Ardent Computer Corp. said it added four computational fluid dynamics software packages to its Titan line of graphics supercomputers. The programs include the following: Omniplot and Usaero, both from Analytical Methods, Inc. in Seattle; Phoenics from Cham Ltd. in London; and Fidap, a flow solver from Fluid Dynamics International, Inc. in Evanston, Ill. Business Systems Resources in Waltham, Mass., said it will adapt its Advance software to IBM's Systems Application Architecture (SAA) guidelines. Advance supports the information needs of college and university alumni and development organizations and other fund-raising requirements. Advance/SAA will be functionally comparable to Business Systems Resources' current Digital Equipment Corp. VAX and IBM/Cullinet Software, Inc. versions of Advance. Sybase, Inc. recently opened a Canadian subsidiary that is expected to be headquartered in Toronto. Sybase develops relational database management systems designed for on-line applications. Vista Financial Systems, Inc. won a contract with American Savings Co. in Omaha to provide on-line data processing services. Under the contract, the bank will install the Vista Financial Terminal System, which is a personal computer-based teller and host system. IXI Ltd in Cambridge, England, a developer of X Windows standard-related software and services, said it will tailor its desktop manager program, called X.desktop, to support the Open Software Foundation as well as AT&T Open Look style guides. Compusystems, Inc., a developer of banking software specializing in mainframe collection and recovery, said Household Finance Corp. selected Compusystems' The Tracker on-line collection system. 
Household Finance will use the software package to handle the collection needs of its 800 consumer finance offices located throughout the U.S. <<<>>> Title : Radstone Technology has a Author : CW Staff Source : CW Comm FileName: hwradsto Date : Jan 30, 1989 Text: Radstone Technology has announced the 68-33, a Motorola, Inc. 68030-based board for VMEbus multiprocessing applications. The product reportedly provides up to 2M bytes of quad-ported, no-wait state random-access memory and operates at speeds up to 33 MHz. According to the vendor, the board is compatible with a variety of operating systems and software including VXCEL, a proprietary operating environment based on VRTX32, and Unix. The 68-33 is priced from $4,595. Radstone, 1 Blue Hill Plaza, Pearl River, N.Y. 10965. 800-368-2738. <<<>>> Title : Sky Computers, Inc. has i Author : CW Staff Source : CW Comm FileName: hwskycom Date : Jan 30, 1989 Text: Sky Computers, Inc. has introduced a Motorola, Inc. VMEbus version of its Warrior II array processor for Sun Microsystems, Inc. workstations. The Sky Warrior II/S can reportedly execute complex algorithms up to 27% faster than the company's previous Warrior product for Sun-3 and Sun-4 platforms. Designed specifically for engineering and scientific applications, the board also includes a software library of vector subroutines that can be executed from Fortran or C language programs. The Sky Warrior II/S costs $11,900. Sky Computers, Foot of John St., Lowell, Mass. 01852. 617-454-6200. <<<>>> Title : Intel Corp. has introduce Author : CW Staff Source : CW Comm FileName: hwintelc Date : Jan 30, 1989 Text: Intel Corp. has introduced a set of boards developed to combine the capabilities of its 80386-based processor, Multibus II multiprocessing architecture and DOS-compatible software. 
The Multibus II PC Subsystem reportedly consists of a 16-MHz 80386 CPU board, a peripheral companion board containing a hard-disk controller and IBM Video Graphics Array controller and an adapter board that allows users to add standard half-length IBM Personal Computer XT and full-length PC AT bus boards. Scheduled for availability in the second quarter, the products are priced from $195 to $4,700. Intel, P.O. Box 58065, Santa Clara, Calif. 95052. 800-548-4725. <<<>>> Title : Simpact Associates, Inc. Author : CW Staff Source : CW Comm FileName: hwsimpac Date : Jan 30, 1989 Text: Simpact Associates, Inc. has released its Real-Time Clock (RTC) for Digital Equipment Corp. VAXBI-class computers. The RTC is a programmable real-time clock option that is said to provide high-resolution, precise interval timing. Features include a 32-bit-wide counter, and up to 512 events can be timed, counted and stored for subsequent retrieval by the host application program, the vendor said. The RTC is priced from $4,990 to $5,890. Simpact, 9210 Sky Park Court, San Diego, Calif. 92123. 619-565-1865. <<<>>> Title : An optical host adapter s Author : CW Staff Source : CW Comm FileName: hwqualog Date : Jan 30, 1989 Text: An optical host adapter specifically designed for Digital Equipment Corp.'s Unibus systems has been announced by Qualogy, Inc. The quad-wide QLC-1100 is reportedly compatible with DEC hardware and software. It allows the optical storage system to replace any tape-storage system without modifying the applications software, the vendor said. The QLC-1100 costs $2,395. Qualogy, 1751 McCarthy Blvd., Milpitas, Calif. 95035. 408-434-5200. <<<>>> Title : Burr-Brown Corp. has intr Author : CW Staff Source : CW Comm FileName: swburrbr Date : Jan 30, 1989 Text: Burr-Brown Corp. has introduced a transaction processing software package designed to add real-time, networked data collection capabilities to Digital Equipment Corp.'s VAX/VMS-based computer systems. 
The TMV9000 tool manages a network of Burr-Brown data collection devices, the vendor said, and provides a straightforward interface between data collection systems and other manufacturing applications software. A 1,600 bit/in. 9-track tape or TK50 cartridge is included. Pricing is dependent on individual system configuration. Burr-Brown, P.O. Box 11400, International Airport Industrial Park, Tucson, Ariz. 85734. 602-746-1111. <<<>>> Title : An end-user support infor Author : CW Staff Source : CW Comm FileName: sw4st Date : Jan 30, 1989 Text: An end-user support information system for IBM IMS/DC and CICS environments has been announced by 4.ST North America, Inc. TIMS 1.3.1 is an on-demand Help, support and documentation facility that can be accessed from any IMS or CICS transaction, according to the vendor. It was developed to replace manuals and other printed documents with on-line, real-time updates and to reduce end-user training time by making applications easier to use. The software is priced from $6,000 to $125,000, depending on configuration. 4.ST North America, Suite 412, Oakwood Corporate Center, 401 Whitney Ave., Gretna, La. 70053. 504-366-9944. <<<>>> Title : Vista Financial Systems h Author : CW Staff Source : CW Comm FileName: swvistaf Date : Jan 30, 1989 Text: Vista Financial Systems has introduced an on-line, real-time integrated retail banking system. The Advanced Financial System (AFS) software is available for fault-tolerant systems and is said to organize all account, financial and demographic information by customer name rather than by account type. AFS can operate with on-line services or in-house, turnkey systems. Single-license fees range from $400,000 to $700,000, depending on the customer's computing configuration and options purchased. Vista Financial, Suite 400, 1807 Park 270 Drive, St. Louis, Mo. 63146. 314-878-4210. 
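Vista's claim that AFS organizes account, financial and demographic data by customer name rather than by account type describes a customer-centric data model. The sketch below is purely illustrative _ the records and field names are invented, not Vista's actual schema _ and only contrasts the two groupings:

```python
from collections import defaultdict

# Hypothetical records; a real retail banking system holds far more detail.
accounts = [
    {"customer": "Jane Doe", "type": "checking", "balance": 1200.00},
    {"customer": "Jane Doe", "type": "savings",  "balance": 8500.00},
    {"customer": "John Roe", "type": "checking", "balance":  340.00},
]

# Account-centric view: one file per account type, the traditional layout.
by_type = defaultdict(list)
for acct in accounts:
    by_type[acct["type"]].append(acct)

# Customer-centric view: every holding reachable from the customer's name,
# the organization AFS is said to use.
by_customer = defaultdict(list)
for acct in accounts:
    by_customer[acct["customer"]].append(acct)

print([a["type"] for a in by_customer["Jane Doe"]])  # ['checking', 'savings']
```

Under the account-centric layout, answering ``what does Jane Doe hold?'' means scanning every account file; under the customer-centric layout it is a single lookup.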
<<<>>> Title : Computerline, Inc.'s Plan Author : CW Staff Source : CW Comm FileName: swplantr Date : Jan 30, 1989 Text: Computerline, Inc.'s Plantrac Project Management System is now available for Apollo Computer, Inc. workstations running the AT&T Unix System V operating system. Developed for project managers and planning engineers, the software reportedly offers day-to-day or hour-to-hour scheduling and management of both large and small projects. Features include computer-aided design techniques and report-writing capabilities. According to the vendor, the Plantrac Project Management System for Apollo and Sun Microsystems, Inc. workstations is priced from $5,995 for a three-user license. Computerline, P.O. Box 308, 52 School St., Pembroke, Mass. 02359. 617-294-1111. <<<>>> Title : The Dylakor Co. division Author : CW Staff Source : CW Comm FileName: swdyalko Date : Jan 30, 1989 Text: The Dylakor Co. division of Sterling Software, Inc. has announced enhanced versions of Dyl-280 and Dyl-280 II, the company's fourth-generation language information management software packages. Designed to run on IBM and compatible mainframes, the products reportedly include additional keywords in the print function and extended arithmetic capabilities. Dyl-280 Release 5.5 is priced at $13,000 for VSE or VM environments, $16,000 for MVS environments. Dyl-280 II Release 2.5 is priced at $17,000 for VSE and VM environments. An MVS version costs $20,000. Dylakor, P.O. Box 2210, Chatsworth, Calif. 91313. 818-718-8877. <<<>>> Title : Macworld Expo is revisite Author : CW Staff Source : CW Comm FileName: mac5 Date : Jan 30, 1989 Text: SAN FRANCISCO _ Last week's Macworld Expo marked the one-year anniversary of Apple Computer, Inc.'s much-publicized liaison with Digital Equipment Corp. 
That marriage, announced at a hastily assembled press conference that kicked off last year's Macworld Expo, has yet to bear fruit, despite acknowledgment that there was synergy between Apple's Macintosh personal computer and DEC's VAX minicomputer. ``There hasn't been a whole lot concrete that's come out of the relationship,'' said Nina Burns, vice-president of Infonetics, Inc., a Santa Clara, Calif.-based market research firm. Speculation after the announcement focused on the possibility of the two companies jointly developing products or even the possibility of DEC peddling Macintoshes as terminals to its VAX minicomputers. Neither has happened, and it now seems unlikely that DEC will be hawking Macs following its recent introduction of its own line of desktop systems. Third-party developers are heartened by the news that neither Apple nor DEC has introduced competing products. ``It was an arrangement of convenience,'' said Steve Nelson, marketing director at Kinetics, Inc., a third-party developer of Mac-to-VAX connectivity products. ``There was so much hoopla a year ago that made it almost seem like they were merging. ``Not a lot has changed,'' Nelson said. ``People were buying Mac-to-VAX connectivity products, and they're continuing to do that.'' Third parties are awaiting developer's guidelines for future-generation products. The Apple-DEC partnership overshadowed a myriad of products introduced by Apple third parties as well as three printers from Apple itself. Apple and DEC's relationship is not the only disappointment from last year's Macworld. It also marked Lotus Development Corp.'s reentry into the Macintosh product world with a revamped version of its original Mac software package called Jazz. Modern Jazz, an integrated software program that debuted three years after the introduction of Jazz, was supposed to be free of the problems that sank the original program while also adding new capabilities. 
However, the updated version has proved to be an even bigger failure than its predecessor. According to Infocorp, a Cupertino, Calif.-based market research firm, an estimated 115,500 copies of Jazz were sold _ well below Lotus' expectations. Modern Jazz never shipped. Months after its Macworld debut, it was scrapped amid reports that Lotus could not resolve some technical problems in the program. Other products that debuted at last year's Macworld have fared better than Modern Jazz. Apple introduced three laser printers to replace its older Laserwriter and Laserwriter Plus models. The products were an attempt to renew Apple's commitment to the desktop publishing market. Powered by a new Canon USA, Inc. engine, the new printers were said to offer up to four times the speed of Apple's older products and three times the print life. The printers have been well received, according to Robert Fennell, an industry analyst at Dataquest, Inc., a San Jose, Calif., market research firm. They boasted features like improved paper-handling capabilities and more fonts, he noted. However, the products were affected by a shortage of memory components. The entry-level model, the Laserwriter II SC, comes standard with 1M byte of random-access memory. The mid-range model, the Laserwriter II NT, and the high-end product, the Laserwriter II NTX, are sold with 2M bytes of RAM as standard features. The dynamic RAM scarcity forced Apple to raise product prices last September. However, Fennell said Apple was able to maintain its share of 10% to 15% of the laser printer market in 1988. By Julie Pitta, CW staff <<<>>> Title : End users navigate hard d Author : CW Staff Source : CW Comm FileName: magellan Date : Jan 30, 1989 Text: CAMBRIDGE, Mass. _ Some companies sell shell programs that shield users from the difficulties of Microsoft Corp.'s MS-DOS. Others sell tools that search hard disks for particular pieces of information. 
And others pitch packages that help users back up data in case of power failure. Beginning in April, Lotus Development Corp. will offer Magellan, a unique $199 package that provides file searching capability with a few twists. For instance, once a file is located, the user can read the file using its native file format. This will allow users to view, for example, 1-2-3 worksheets, even though they do not have a copy of 1-2-3. In addition, Magellan can automatically launch the user into the application that created the file to perform further editing, printing or transmission. Magellan also implements ``fuzzy'' searching, allowing users to key in items that approximate what is actually on the disk. ``There is very little here that is actually new,'' said Lotus Vice-President Ed Belove. Instead, it is the combination of technologies that makes Magellan unique, Belove said. Magellan is also handy for gathering information from a variety of files and putting it together for a report. For example, a user can search a disk for all files that pertain to IBM Personal Computer pricing and pull out the relevant portions of each file to see trends or write a memo to a dealer asking for better terms. The product is clearly aimed at today's character-oriented environment and effectively reads files from Lotus' 1-2-3, Symphony, Agenda and Manuscript, as well as Wordperfect Corp.'s Wordperfect, IBM's Displaywrite, Ashton-Tate Corp.'s Dbase and Multimate, Microsoft's Word and Micropro International Corp.'s Wordstar. The Magellan system reportedly works less well with graphically oriented programs that use bit-mapped images, such as Microsoft's Windows applications. However, users of these types of programs are able to read the text contained in these bit-mapped files, according to Lotus officials. 
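Lotus has not described how Magellan's ``fuzzy'' searching is implemented. As a general illustration only _ an edit-distance filter is one common way to match approximately keyed input against what is actually on the disk; nothing below is Lotus code _ a minimal sketch:

```python
def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def fuzzy_search(query: str, terms: list, max_dist: int = 2) -> list:
    """Return terms within max_dist edits of the query, closest first."""
    hits = [(edit_distance(query.lower(), t.lower()), t) for t in terms]
    return [t for d, t in sorted(hits) if d <= max_dist]

# A query with a typo still finds the intended term.
print(fuzzy_search("pricng", ["pricing", "printing", "parsing"]))
```

With a small edit-distance threshold, ``pricng'' matches ``pricing'' (one missing letter) while more distant terms are filtered out.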
By Douglas Barney, CW staff <<<>>> Title : An Apple for the '90s Author : Julie Pitta Source : CW Comm FileName: col2 Date : Jan 30, 1989 Text: As Apple faces the challenge of evolving Macintosh technology, it does so with a team of executives who are learning to love the Mac. Apple in 1989 bears little resemblance to the rag-tag company founded in a garage by Steve Jobs and Steve Wozniak. The engineers still wear T-shirts and jeans, and you can still find inflatable beach toys in their cubicles. It's the look in the executive offices that has changed. There, you'll find a group of gray-templed, blue-suited executives who wear blue jeans on Fridays only. The transformation has been by design. Apple Chairman and Chief Executive John Sculley realized that Apple needed to change its image if it was to be taken seriously by Fortune 1,000 customers. He understood that a blue suit feels more comfortable talking to another blue suit when he's doing business. The departure of Apple Senior Vice-President of Sales Chuck Boesenberg represents the latest in what has been a series of exits by Apple veterans. Most notably, Del Yocam, one-time Apple veteran and Chief Operating Officer, has announced his resignation, effective later this year. Debbie Coleman, Apple's chief financial officer, has said she will take a leave of absence and return to Apple a few months from now in a position of less responsibility. Like Yocam, Coleman is an Apple veteran, originally hired by cofounder Steve Jobs as controller for his favored Mac division. Coleman's sabbatical is attributed to health problems. Evidently, the stress of her career has taken its toll. However, she insists that her future is still with Apple. Eventually, she hopes to head an Apple spin-off like Claris. Yocam's departure is not as easy to explain, although he insists that the decision is his own. In one of Apple's many reorganizations last year, Yocam took a backseat to newer faces. 
Allan Loren, a former MIS executive at Cigna Insurance, who at the time of the reorganization had been on board for less than a year, became president of Apple USA. Yocam was given the Apple Education division. Boesenberg leaves at a time when the industry is rife with rumors that Loren is handing out pink slips to many Apple veterans. Boesenberg will become a senior executive at Mips Computer Systems, Inc. in March. The announcement of his resignation comes only weeks after John Scull, who headed Apple's desktop publishing group, left the company to become president of Macromind, an Apple third-party software developer. The change at Apple is not uncommon. Leaders tend to hire and promote in their own image. Like Apple Chairman Sculley, Loren is a member of the East Coast establishment. Replacing these old-timers are a group of seasoned executives. However, their experience is with Microsoft MS-DOS-based systems. Gerry Malec, Apple's vice-president of business marketing, and Donald Casey, vice-president of networking, are both former IBMers. These and other Apple executives will be responsible for moving the Mac forward. Jobs had a point when he called the Mac ``a technology of the '80s.'' The Mac's operating system sorely needs to be updated: It lacks the multitasking capabilities that OS/2 with the Presentation Manager will offer. OS/2 with the Presentation Manager takes the Mac's advantage _ a graphical user interface _ and does it one better with more power. The challenges are formidable. The question remains whether a team of new Mac converts can restore Apple's reputation as a technology leader. By Julie Pitta; Pitta is Computerworld's West Coast senior correspondent. <<<>>> Title : Esber stands firm behin D Author : Douglas Barney Source : CW Comm FileName: esber Date : Jan 30, 1989 Text: Late last year, Ashton-Tate Corp. stunned the personal computer world when it sued Dbase cloner Fox Software, Inc. for copyright infringement. 
The firm must have known that suing a vendor of a popular, high-performance database management system would create a serious backlash. Simply put, customers like price competition and product choices and do not like lawsuits that seem to be aimed at stifling competition. Complicating matters, Dbase author C. Wayne Ratliff has claimed that Dbase was derived from a DBMS that is essentially in the public domain and not proprietary to Ashton-Tate. But vendors are not customers and clearly have a different set of goals, such as market share and return on investment. It is this very different set of goals that has prompted Lotus Development Corp., Apple Computer, Inc. and, recently, Ashton-Tate, to sue firms that produce software too much like the original. Computerworld Senior Editor Douglas Barney spoke with Ashton-Tate Chairman Ed Esber, who defended his company's actions and explained his organization's rationale for the lawsuit. You have sought to protect the Dbase language from infringement, but so far, no court has ruled that languages are protectable. There is no legal precedent to indicate that the language can't be protected. This is an integral part of the Dbase product. I do get upset when I read almost every article on this thing that makes a blanket statement that languages are not protected or aren't part of the copyright law. That is blatantly not true. The language is ours. We created it. A minute part of it does include other public-domain software that under law we can incorporate. I find it highly ironic that Wayne Ratliff _ who several years ago was trying to sell [Dbase] and become an employee of Ashton-Tate _ represented certain things and was given $15 million to $20 million. Now as a competitor, he suddenly claims that the whole thing was derived from public domain and is not ours. I would be happy to take a refund if he is basically saying that he took it. 
What is the distinction between taking elements and concepts from others _ and using them to the benefit of users _ and outright stealing? The courts have ruled on a specific thing like that. Everybody clearly has the right to survey the competition, listen to their customers and to incorporate, in some manner, capabilities derived from customer input. Fox has made several statements asserting that we ``copied a few features from them,'' and I have stated on many occasions that I hope they win. If they win saying that we took three features, they make our case that they stole 997 features. Hasn't Ashton-Tate borrowed some of the concepts and methodologies pioneered by others for use in Dbase IV? There are very few things that the Dbase clone vendors pioneered that weren't common in either mini or mainframe databases or computer science for the last 20 years. For example, SQL is a language that you have taken. IBM specifically took action to place that in the public domain. There is nothing to say a language is not protectable. For the languages we are used to dealing with in the computer world _ Basic, Cobol, C _ actions were taken by the inventors or institutions that created them to purposefully put them in the public domain. We have always asserted that the language is an important element of our proprietary right, and we intend to protect it. Some people are giving us a hard time for not filing the suit earlier. We do not use the courts lightly. We took several actions in other manners besides the legal system to assert our rights, but nobody listened to us. What about Wordtech? Aren't they immune from a lawsuit because they sold you technology? We believe Wordtech is substantially overstating the effect of that release. We have provided that release only to Wordtech and to no one else. The release pertains only to Wordtech products that existed in 1987 and subsequent versions of those products so long as they haven't been substantially modified. 
Dbase IV is not a part of that release. We are watching Wordtech very, very carefully to see what kind of products they bring out. What about the IEEE standards committee that is hoping to standardize the Dbase language? My statement to them is that if this committee wants to create their own database language, that is great. But since we believe Dbase is ours and protectable, they can't use the name Dbase and can't use the Dbase language. Because the court has not ruled on language protection, clone vendors face a period of uncertainty. Is that using the court to lock these people out of particular markets? We do not use the court system to in any way harm legitimate competition. Let's assume Ashton-Tate prevails. What is the benefit for the customer? The first time any two people file a lawsuit who in some manner appear to be competing, somebody sticks the consumer up, wraps them in an American flag and says prices are either going to go up or technology will not move forward. Arguing against either of these issues is like arguing against apple pie and motherhood. Users will benefit because firms that take risks are ultimately rewarded and will ultimately continue to take risks and bring new things to the market. <<<>>> Title : Atlanta tests laptops and Author : CW Staff Source : CW Comm FileName: homeless Date : Jan 30, 1989 Text: Health care agencies face nearly insurmountable odds in trying to medically treat the growing U.S. homeless population. This is largely because few street people regularly go to health clinics or attend the same one more than once. But a social service agency in Atlanta hopes to give the homeless a better chance of receiving vitally needed medical treatment by putting a squad of mobile health clinics, backed by personal computers, on the streets. Although the mobile unit program has not been fully implemented, community health service coordinator Bob Stokes said the computer technology has worked successfully in field tests. 
While there are no guarantees that the agency will eventually treat a majority of Atlanta's sick and injured homeless individuals, what was once impossible can now at least be attempted, thanks to computer technology, he said. The computerized tracking effort is the first of its kind, according to Stokes. ``Birmingham does data input on a computer, but we're the only ones I know of who are working with a phone line, modem and PC for on-line access to a homeless person's records,'' he said. Three mobile units _ each with a doctor, a social service case worker and a driver who assists with medical attention and medical supplies inventory _ bring health care to the curbside. Each member of the group assists in the street patient's treatment. The members then update the patient's record on their own PCs. The data is then transmitted to a central computer site. The community health services group's mission is to track and medically treat the 10,000 to 12,000 sick and injured homeless people in Atlanta. Cross-reference ``Because they are migratory, we have no clear census of how many of them are out there or how many are sick,'' Stokes said. ``We only know how many ill ones come in to be treated. Without a computer to cross-reference their files, we can't keep track of them.'' Stokes said that the homeless who arrive at the clinics most often need treatment for serious problems, such as respiratory illness, influenza, substance abuse and mental illness. To help monitor a homeless person's medical treatment and ensure that he or she receives the proper medication, the group uploads information from a database that cross-references each social worker's input. According to Bob Mead, president of Lifecare Technologies in Atlanta, these systems can be used by workers who do not know how to use a PC. ``Because the PCs are so simple to use, even a volunteer can come in and update a record,'' he said. 
``It helps give their program a sense of continuity.'' Lifecare Technologies donated the technical and consultant resources to develop PC software applications for the health care agency. End users first enter data into The Write-Top from Linus Technologies, Inc. in Reston, Va., by writing with an electronic pen on an LCD screen instead of typing in information. The screen also displays data requested by the user. When the user writes on the 80- by 25-character screen, a transparent digitizer interprets the input and converts it into digital signals. The signals are then converted to ASCII to resemble input from a keyboard. The request or entry input is then uploaded to the home-base computer by modem over a private branch exchange telephone line. The medical files are stored in a database on a Hewlett-Packard Co. minicomputer. By William Brandel, CW staff <<<>>> Title : Graphics aid in software Author : CW Staff Source : CW Comm FileName: syscorp Date : Jan 30, 1989 Text: AUSTIN, Texas _ Bringing end users into the application development cycle too late can lead to costly misunderstandings of their needs. Conversely, bringing them in too early often leads to costly delays. Syscorp International's Microstep promises to speed application development and allow end users to become involved in the process virtually from the beginning. Microstep (``step'' stands for Specification to Executable Program) is a computer-aided software engineering product that produces executable C language programs directly from graphical specifications. The product is designed to improve software development productivity using four basic features: an interactive graphics design environment, automatic specification analysis, generation of executable code and production of high-quality technical documentation. 
Unlike conventional programming tools, which require the developer to describe the operation of an application in words, Microstep makes use of intelligent graphic symbols that enable the developer to draw the application. Microstep then automatically produces an executable program as well as system documentation directly from the complete specification. According to the company, C language programs generated with Microstep are 100% executable. ``In our initial use of Microstep, we completed a 160-hour application development effort in 10 hours,'' said Jules Ghedina, principal-in-charge of Peat Marwick Main & Co.'s national technical center based in Montvale, N.J. ``This initial high productivity, combined with the product's design and validation feature, makes it desirable for us to use Microstep on consulting engagements requiring PC application software development,'' he said. Syscorp and Peat Marwick recently signed an agreement that calls for the latter to provide implementation assistance, training and custom application development services for Microstep. An analyst or programmer can use Microstep's mouse-driven, graphic specification environment, which features five sets of design tools, to build data flow diagrams, lay out screens and format reports as well as describe the application's computations and other activities. Resists inconsistencies Elements of a design specification reportedly can be copied and stored in a data dictionary for use in other specifications, helping to reduce design inconsistencies resulting from redundant development efforts. The Texas Water Commission in Austin has been using Microstep to develop stand-alone prototype applications. These applications will be used by individual end users in satellite offices to compile data related to water quality and hazardous waste at various sites, said John Wilson, manager of the applications development center. 
In a pilot program soon to be under way, end users will receive an application written with Microstep that will be used to tabulate and print data gathered during field inspections. However, end users will have to send the disks to commission headquarters on a monthly basis rather than upload the information electronically because Microstep is not designed for use on networks, Wilson pointed out. The lack of a networking version is a critical limitation to the product's viability for the water commission, he added. ``We have some other questions about it _ for instance, in a tutorial application we're working on _ but the folks at Syscorp have been responsive,'' Wilson added. In development A network version is in the development stage and will probably be available this summer, according to a Syscorp spokesperson. Microstep runs on an IBM Personal Computer AT or compatible equipped with 640K bytes of memory, a 20M-byte hard disk, DOS 3.1 through DOS 3.3, an IBM Enhanced Graphics Adapter or Hercules Computer Technology, Inc. video card and a Microsoft Corp.-compatible mouse. The company's suggested list price is $5,000. ``The price is probably a little high for its capability because it is basically a single-user product at this point,'' Wilson said. ``But being able to make unlimited runtime copies justifies the price for us. It meets the bill, as we think it will be used by hundreds of people'' in the satellite offices. By Michael Alexander, CW staff <<<>>> Title : Profit Technology, Inc. h Author : CW Staff Source : CW Comm FileName: micprofi Date : Jan 30, 1989 Text: Profit Technology, Inc. has announced the Pro/One Model 35 business computer. The unit is based on a NEC Corp. V20 processor running at either 4.77 or 10 MHz and can be configured with 640K bytes of memory. Options include an Intel Corp. 8087 math coprocessor, 3½- or 5¼-in. floppy drives and 20M-, 40M- or 80M-byte fixed disks. 
A basic system with 256K bytes of random-access memory, one 1.2M-byte 5¼-in. floppy drive and a monochrome monitor costs $795. Profit Technology, Pro/One Division, Suite 1441, 17 Battery Place, New York, N.Y. 10004. 800-223-4628. <<<>>> Title : Certiflex Corp. has relea Author : CW Staff Source : CW Comm FileName: miccerti Date : Jan 30, 1989 Text: Certiflex Corp. has released Version 5.0 of its Certiflexplus Client Write-Up system. The program has reportedly been enhanced to provide a 250% speed increase over the previous version. The Certiflexplus system also offers on-line Help and 13 period and data file conversions for all existing client files, according to the company. The Certiflexplus Client Write-Up 5.0 costs $995. Current users may upgrade for $245. Certiflex, 12920 Senlac Drive, Dallas, Texas 75234. 800-237-8435. <<<>>> Title : A program for sales and m Author : CW Staff Source : CW Comm FileName: mictechn Date : Jan 30, 1989 Text: A program for sales and marketing managers using IBM Personal Computer XTs, ATs and compatible systems has been announced by Technical Sales and Marketing Associates. The Sales Source Manager reportedly maintains separate databases for sales leads, territories and addresses, advertising sources and product descriptions. Report- and label-generating capabilities are also included. The package is priced at $249. Technical Sales and Marketing, P.O. Box 8655, Fountain Valley, Calif. 92728. 714-968-9838. <<<>>> Title : Left Coast Software has e Author : CW Staff Source : CW Comm FileName: micleftc Date : Jan 30, 1989 Text: Left Coast Software has enhanced its check-writing and personal accounting program for IBM Personal Computer and compatible users. Exchequer Version 2.0, designed to automate the billing process in small businesses or offices, requires 230K bytes of available memory and one floppy drive. 
The software reportedly supports any printer that can handle continuous-feed checks and is priced at $49.95 plus $3.00 shipping and handling. Left Coast, P.O. Box 160601, Cupertino, Calif. 95016. 800-234-0554. <<<>>> Title : An occupational analysis Author : CW Staff Source : CW Comm FileName: micsophi Date : Jan 30, 1989 Text: An occupational analysis and job matching system designed for personnel departments, employment agencies and career counselors has been announced by Sophisticated Software Development, Inc. According to the vendor, Majic can generate employer listings, job orders, job listings, occupational analyses and client ability profiles. The software is priced at $2,000 per workstation. Sophisticated Software, Suite 220, 8625 S.W. Cascade Ave., Beaverton, Ore. 97005. 503-641-4900. <<<>>> Title : A business forecasting so Author : CW Staff Source : CW Comm FileName: micconce Date : Jan 30, 1989 Text: A business forecasting software package has been announced by Concentric Data Systems, Inc. Trendsetter Expert was designed for sales forecasting, expense projection and inventory planning, according to the vendor. The product reportedly works as an add-in with Lotus Development Corp.'s 1-2-3 Releases 2.0 and 2.01 and Symphony Releases 1.1, 1.2 and 2.0. A hard disk is required for operation. Trendsetter Expert costs $149. Concentric, 18 Lyman St., Westboro, Mass. 01581. 508-366-1122. <<<>>> Title : Tarbell Electronics has a Author : CW Staff Source : CW Comm FileName: mictarbe Date : Jan 30, 1989 Text: Tarbell Electronics has announced a database system that offers drawing and picture graphics as a field type, the company said. The Datasketch system reportedly includes numeric, character, date, multiline and sound field types and built-in art capabilities. The program requires IBM PC-DOS or Microsoft Corp. MS-DOS 3.00 or higher and an IBM Color Graphics Adapter, Enhanced Graphics Adapter or Hercules Computer Technology, Inc. display, the vendor said. 
The program, which is not copy protected, carries a $99 introductory price that includes sample programs and files. Tarbell, Suite C, 1082 E. Artesia Blvd., Long Beach, Calif. 90805. 213-422-7081. <<<>>> Title : Software Publishing Corp. Author : CW Staff Source : CW Comm FileName: micautog Date : Jan 30, 1989 Text: Software Publishing Corp. and Autographix, Inc. have announced the start-up of the Autographix Overnight Slide Service for users of Software Publishing's Harvard Graphics software package. The service reportedly permits Harvard Graphics users to transmit files via modem to authorized Autographix Service Centers and receive 35mm slides, color overhead transparencies or color prints within 24 hours. The current charge for same-day service and remote 24-hour turnaround is $12 per color slide. Users of Harvard Graphics 2.1 can receive an overnight slide service kit free of charge by calling 800-548-8558. Autographix, 100 Fifth Ave., Waltham, Mass. 02154. 617-890-8558. <<<>>> Title : Group L Corp. has reduced Author : CW Staff Source : CW Comm FileName: micgroup Date : Jan 30, 1989 Text: Group L Corp. has reduced the price of its full-text retrieval program for IBM Personal Computers and compatible systems. Designed to transform individual PC files into free-form databases for easy searching, Memory Lane, formerly priced at $149, is now available for $99. The information management utility can reportedly locate text or numbers stored anywhere on a hard disk. Group L, 481 Carlisle Drive, Herndon, Va. 22070. 703-471-0030. <<<>>> Title : K-Talk Communications, In Author : CW Staff Source : CW Comm FileName: micktalk Date : Jan 30, 1989 Text: K-Talk Communications, Inc. has announced a graphics version of its mathematical editing software. Designed to allow users to construct math expressions for technical documents, Version 1.1 of Mathedit can output math equations in a .PCX graphics file, the vendor said. 
The resulting files can be inserted into Wordperfect Corp.'s Wordperfect 5.0, Aldus Corp.'s Pagemaker and several other programs. The package runs on IBM Personal Computers and compatibles and costs $149. K-Talk Communications, Suite 100, 50 McMillen Ave., Columbus, Ohio 43201. 614-294-3535. <<<>>> Title : Bitstream, Inc. has annou Author : CW Staff Source : CW Comm FileName: micbitst Date : Jan 30, 1989 Text: Bitstream, Inc. has announced that it will release its entire typeface library for use with Adobe Systems, Inc. Postscript PDL-based typesetters driven by Apple Computer, Inc.'s Macintosh computers. According to the vendor, the Bitstream Type Library for Postscript will work with several typesetters, including Linotype's Linotronic and Compugraphic Corp.'s CG 9400-PS. The first fonts are scheduled for delivery in February. The library will be priced at $50 per font, with a minimum purchase of four fonts, according to the vendor. Bitstream, Athenaeum House, 215 First St., Cambridge, Mass. 02142. 617-497-6222. <<<>>> Title : DSI Micro, Inc. has expan Author : CW Staff Source : CW Comm FileName: micdsimi Date : Jan 30, 1989 Text: DSI Micro, Inc. has expanded its range of training programs for Wordperfect Corp. Wordperfect users to include Quick Course for Wordperfect for the Macintosh. The course is said to be especially suited for preparing short documents and incorporates four segments for teaching users the basics of the Wordperfect program. An Apple Computer, Inc. Macintosh Plus, Mac II or Mac SE with one 800K-byte floppy disk drive and 512K bytes of memory is required. The software costs $69 per unit and includes an outlined program guide. DSI Micro, 770 Broadway, New York, N.Y. 10003. 212-475-3900. <<<>>> Title : Working Software, Inc. ha Author : CW Staff Source : CW Comm FileName: micworki Date : Jan 30, 1989 Text: Working Software, Inc. has released a word processing package for Apple Computer, Inc.'s Macintosh machine. 
Called Quickletter, the product can be used either as an application or desk accessory, according to the vendor, and provides the user with several letter composition and formatting features, including a page preview function. The program requires 512K bytes of random-access memory and costs $124.95. Working Software, P.O. Box 1844, Santa Cruz, Calif. 95061. 408-423-5696. <<<>>> Title : Crate Technology, Inc. an Author : CW Staff Source : CW Comm FileName: miccrate Date : Jan 30, 1989 Text: Crate Technology, Inc. announced it has expanded its line of internal hard disk drives for Apple Computer, Inc. Macintosh systems. The company's Innercrate series now includes a 600M-byte drive that is compatible with the Macintosh II. Dubbed the Innercrate 600, the unit offers an average access time of 16.5 msec, the vendor said, and is priced at $3,845. A 155M-byte tape backup system for the Macintosh was also introduced. Tapecrate 155 reportedly backs up files at 7M byte/min and offers on-screen Help information for all functions. It costs $1,049. Crate Technology, 6850 Vineland Ave., Building M, N. Hollywood, Calif. 91605. 818-766-4001. <<<>>> Title : Meta Systems Ltd. has ann Author : CW Staff Source : CW Comm FileName: micmetas Date : Jan 30, 1989 Text: Meta Systems Ltd. has announced a Microsoft Corp. PC/Windows computer-aided software engineering (CASE) tool. Quickspec reportedly allows systems analysts and designers to use their personal computers for entering, editing and reviewing project information in an object-oriented CASE repository. The program runs under any operating environment supported by Microsoft Windows, the company said, and requires 640K bytes of memory, a hard disk and a mouse. Quickspec is priced at $3,500 and is scheduled for February delivery. Meta Systems, Suite 200, 315 E. Eisenhower Pkwy, Ann Arbor, Mich. 48108. 313-663-6027. 
<<<>>> Title : A sales training and rein Author : CW Staff Source : CW Comm FileName: micprofi Date : Jan 30, 1989 Text: A sales training and reinforcement series has been introduced by Profit Technology, Inc. The Sales Bible Speedtutors are said to be DOS-based programs designed to increase selling potential by offering sales personnel short, continuous-feed reinforcement of key selling points. The software is available in 11 different versions, each concentrating on a specific sales technique. Each Speedtutor has a price tag of $19.95. Profit Technology, Suite 1441, 17 Battery Place, New York, N.Y. 10004. 800-223-4628. <<<>>> Title : DSI Micro, Inc. has annou Author : CW Staff Source : CW Comm FileName: micdsimi Date : Jan 30, 1989 Text: DSI Micro, Inc. has announced two computer-based training courses developed for Ashton-Tate Corp.'s Dbase IV database management program. Introducing Dbase IV: Mastering the Control Center was designed for the nonprogrammer and includes interactive practice sessions and step-by-step instructions. It is available in both 3½- and 5¼-in. formats and costs $159. Dbase IV: The New Features reportedly gives experienced Dbase users an illustrated guide to the software's enhancements and modified commands and functions. Also available on 3½- and 5¼-in. disks, the product is priced at $75. DSI Micro, 770 Broadway, New York, N.Y. 10003. 212-475-3900. <<<>>> Title : Genicom Corp. has unveile Author : CW Staff Source : CW Comm FileName: micgenic Date : Jan 30, 1989 Text: Genicom Corp. has unveiled its 3410X series of business-class serial matrix printers. The product line comprises five models: the 3410XLS, a high-speed data and word processing printer with continuous forms handling; the 3410XLQ, featuring a quiet enclosure; the high-speed color 3410XCQ; the 3410XBQ for bar-code printing; and the 3410XDQ with Digital Equipment Corp. LA210 emulation capabilities. 
Pricing ranges from $2,010 to $2,600, and shipments are scheduled for the first quarter. Genicom, Genicom Drive, Waynesboro, Va. 22980. 800-443-6426. <<<>>> Title : A nine-pin dot matrix pri Author : CW Staff Source : CW Comm FileName: micseiko Date : Jan 30, 1989 Text: A nine-pin dot matrix printer has been introduced by Seikosha America, Inc. Designated the SP-1600AS, the unit reportedly prints 160 char./sec. in draft mode and 40 char./sec. in near letter quality format. The device is compatible with Epson America, Inc.'s FX and IBM's Graphics printers. The SP-1600AS costs $329. Seikosha, 1111 Macarthur Blvd., Mahwah, N.J. 07430. 201-529-4655. <<<>>> Title : Consider fiber choices Author : Patricia Keefe Source : CW Comm FileName: plas Date : Jan 30, 1989 Text: One of the barriers to fiber-optic installations has always been cost. It is not exactly cheap, and it is not for the technically unsophisticated. But it is literally the (light) wave of the future, especially given the promise of 100M-bit Fiber Distributed Data Interface (FDDI). Proponents of plastic fiber optics would like to alter this picture somewhat. They say they can cut the cost, offer greater tolerance and, most important, bring fiber down to the desk top. Why cope with a tangle of different wiring schemes when you can limit yourself to fiber, which promises flexibility, compactness and indifference to electrical disruptions? And why limit these benefits to campus backbones only? Well, maybe because some observers can't see any reason not to use good old unshielded twisted-pair wire to string together work groups and departmental networks. A more secure, tried-and-true option is thin Ethernet. On the other hand, Netronix, which introduced a plastic- and glass-fiber network last week (see related story page 55), claims to have overcome some of the drawbacks to plastic. It also maintains that plastic fiber is more secure and comparable in price to copper twisted-pair cable. 
Even so, ``It won't be really crucial to bring fiber to the desk top until the advent of super high-powered workstations,'' predicts Richard Cerny, president of Trellis Communications, a Salem, N.H.-based systems integrator that specializes in fiber-optics technology. With numerous campus installations under his belt, Cerny says he has seldom encountered a need to bring fiber to the desk top. ``And if we did, we'd bring glass to the desk, so that it could hook into the glass backbone,'' he says. This brings up another issue worth considering. Two glass-fiber cables of different diameters can be tied together more easily than a plastic cable can be tied to a glass one, claims Bill Redman, an analyst at the Gartner Group, a Stamford, Conn.-based market research firm. Netronix would dispute this point; it suggests that users who want to span greater distances than plastic fiber allows hook into glass-fiber cable. The supplier also claims that plastic fiber's bigger core, through which it transports light, is less susceptible to interference from sources such as dust particles. Plastic fiber's ace in the hole seems to be the promise of lower costs and simple installation. But it is hard to compare the cost of a desktop configuration rigged up with plastic fiber _ such as Netronix seems to be targeting _ with a campus network wired with glass. Redman and Cerny also take issue with the premise that plastic is easier to deal with than glass. Both fibers are fairly flexible, and both degrade as they age. Plastic may be less sensitive to rough handling than glass, but it is also more likely to discolor, affecting light transmission and bandwidth. And it may be more prone to kinking, which would scatter light signals. ``If plastic was cheaper and more reliable, then you'd see AT&T, Siecor and Corning Glass pulling it,'' adds a skeptical Redman. These vendors have invested heavily in glass-fiber optics. 
Not surprisingly, they have put considerable weight behind the emerging FDDI standard, which requires glass fiber. Another factor cementing glass fiber's popularity is IBM's recent decision to purchase 25% of PCO, an optoelectronics subsidiary of Corning Glass. IBM is committed to FDDI and has promised users an FDDI product by year end. ``IBM has an absolute need for this technology. You just can't build mainframe complexes without fiber anymore,'' says consultant Frank Dzubeck, president of Communications Network Architects. What this all boils down to, Trellis' Cerny says, is that a decision to go with plastic fiber is a nonstandard decision. This may be perfectly acceptable to a lot of users, especially if plastic fiber can be cleanly linked to other network media. As always, users need to make decisions based on factors such as their future directions (FDDI), how well suppliers are able to overcome plastic's shortcomings _ for example, shorter distances and smaller bandwidth _ and how entrenched glass or twisted-pair cable is in their systems. Plastic is certainly worth taking a look at. Just be on the lookout for any hidden or intangible costs. By Patricia Keefe; Keefe is a Computerworld senior editor, networking <<<>>> Title : Dayna offers DOS-to-Mac c Author : CW Staff Source : CW Comm FileName: dayna Date : Jan 30, 1989 Text: An OEM version of Novell, Inc.'s Advanced Netware could severely undercut Apple Computer, Inc. and market leader Tops at the low end of the MS-DOS-to-Macintosh connectivity market. Salt Lake City-based Dayna Communications, Inc. is scheduled to launch Daynanet, a server-based networking operating system bundled with an interface card, by the end of the first quarter. Dayna is claiming a marked price advantage over competitive products from Tops, a division of Sun Microsystems, Inc., and Apple. The pricing differential, according to Dayna, is considerable. 
Its comparison of an equivalent configuration among the three competitors breaks down as follows: To support 20 users in an Apple Localtalk-only network, Daynanet requires software, an interface card and an IBM Personal Computer AT clone for a total price of about $3,750. A similar Tops configuration costs $7,800 with a server included. Tops software does not require a dedicated server, but most 20-node networks use one. Apple's Appleshare approach, which utilizes a dedicated Macintosh II as a server, costs $8,100, claimed Lynn Alley, Dayna's cofounder and vice-president of research and development. The hardware platform is what constitutes most of the Appleshare configuration cost. Separately, Daynanet software costs $1,249 per server for Localtalk, or $1,749 for Localtalk and Ethernet. Tops software costs $249 per node, and Appleshare, which analysts said has not been burning up the sales charts, is $799 per server. Netware tie At the core of Daynanet is Advanced Netware. Dayna co-developed Netware for Macintosh with Novell, which shipped last month and reportedly provides basic file and printer services to the Macintosh on an equal basis with Microsoft Corp. MS-DOS-based computers. Daynanet is a specially tailored low-end version that supports Localtalk and Ethernet. Also, Daynanet file servers can be bridged via Localtalk or Ethernet cables to any of the estimated installed 300,000 Novell file servers. The interface card bundled with the operating system is Dayna's Daynatalk PC Card. The server supports up to four cards or four separate networks. Mainframe-to-Mac In a separate announcement, Relay Communications, Inc. in Danbury, Conn., unveiled a Macintosh-to-IBM mainframe file transfer product said to be both the first to display IBM mainframe file lists in Macintosh format and the first such link to incorporate Apple's Macworkstation development tool. 
Macworkstation developers can use Relay Baton to provide error-free message and file transfer to IBM mainframes running Relay/VM or Relay/TSO software. Relay's mainframe software serves an unlimited number of Macintoshes running Relay Baton, according to the vendor. Because Relay Baton takes advantage of the Mac interface, files reportedly can be transferred to and from the mainframe simply by pointing and clicking on file names. This saves Mac users from having to learn IBM mainframe commands and formats. Support for Apple's Multifinder enables Relay Baton to execute background file transfers. The product works asynchronously over telephone lines. Scheduled to be available in February, it runs on the Macintosh Plus, SE, II and IIX and costs $150 per unit. By Patricia Keefe, CW staff <<<>>> Title : Users catch on to LAN bac Author : CW Staff Source : CW Comm FileName: curl1 Date : Jan 30, 1989 Text: On a routine day at Ingersoll-Rand Co.'s Baxter Springs, Kan., division, about 20 people log onto a local-area network consisting of 40 personal computers. They write reports, create painstakingly intricate graphics on computer-aided design software, retrieve and add information from a database and then store it all _ without a thought about whether it will be there tomorrow. In the course of any given week, about 220M bytes of data are stored on two hard drives and a file server on the LAN. A system crash would spell disaster _ two years of data representing untold hours of work could be zapped into oblivion. Fortunately for these users, system supervisor David Hanon is in charge of thinking twice. And he would not dream of letting one day go by without backing up every bit of information. ``We're using our computer system to run our business,'' Hanon says. ``If you don't have data backed up, how would we run our business?'' Lack of preparation Good question, says Bill Redman, an analyst at Stamford, Conn.-based research company Gartner Group, Inc. 
``I'd estimate well under 10% of LANs use a backup system,'' he says, basing the estimate on the fact that less than 5% of his clients in the Fortune 500 adequately back up LANs or PC drives. Behind this nonchalance is the fact that LANs are often purchased at the department level by people who understand the need to share information but not the importance of data security. Redman predicts an attitude change in the next few years. For example, some MIS departments are taking over responsibility for backing up LANs from users who will not do it. And corporations are learning to protect their data as an investment, asking about backup when they buy systems. In short, users are beginning to agree with suppliers like Gunner Bolz, president of Emerald Systems Corp. in San Diego, who says, ``Information on a network is more valuable than the equipment on which it resides.'' It is not as if users do not have a variety of options to choose from. Internal and external backup systems offer a wide range of memory for various PC configurations. Most support the major PC LAN products such as Novell, Inc.'s Netware and 3Com Corp.'s 3+. The typical price range is from under $1,000 to about $9,000. These systems often work while users sleep, and some store data measured in gigabytes _ 2G bytes is about one million typewritten pages _ on tape cassettes no bigger than the ones you slide into your car stereo. Hanon uses the VAST Device backup system from Emerald Systems. VAST's data transfer rate is 1.5M byte/sec. in burst mode and 250K byte/sec. in continuous mode. It also has an internal 256K-byte speed-matching buffer and provides from 250M to 2.2G bytes of storage. ``All you do is log on and then go home,'' he says. The system lets Hanon choose to back up daily or monthly as well as a specific day of the week. The system kicks in at a preset time at night and goes to work. 
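The transfer rates quoted for the VAST Device make it easy to see why an overnight window is more than enough for a site like Hanon's. The back-of-the-envelope sketch below uses only the figures cited above; the function name is illustrative.

```python
# Rough timing for an unattended LAN backup, using the quoted
# VAST Device rates: 1.5M byte/sec. burst, 250K byte/sec. continuous.

def backup_hours(data_bytes, rate_bytes_per_sec):
    """Hours needed to stream data_bytes at a sustained rate."""
    return data_bytes / rate_bytes_per_sec / 3600

MB = 1_000_000
continuous = 250_000          # continuous-mode rate, bytes per second
burst = 1_500_000             # burst-mode rate, bytes per second
site_data = 220 * MB          # the Ingersoll-Rand division's weekly volume

print(f"continuous: {backup_hours(site_data, continuous):.2f} hours")
print(f"burst:      {backup_hours(site_data, burst):.2f} hours")
```

Even at the slower continuous rate, 220M bytes moves in well under half an hour, so a preset nighttime start leaves an enormous margin before the workday begins.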
An added bonus is the system's unattended operation, which means no lost time during work hours while the data is backed up, he says. The payoff The payoff for this diligence came when a 285M-byte hard drive crashed recently. Hanon's department merely waited for delivery of a new drive before restoring the backed-up data in a matter of hours. Another VAST user is Mark Hofius, a senior computer systems analyst at Allen-Bradley Co. in Ann Arbor, Mich. His division has 180 PCs and two servers hooked together on a Novell LAN. When a 175M-byte file server crashed at about 3 p.m. several months ago, Hofius tapped into the division's VAST system and had data restored by 10 p.m. ``If I wasn't backing up with the VAST, I'd have had a major problem,'' he says. In the case of the Columbus, Ohio, branch of the U.S. Postal Service, the need for LAN backup was cemented by numerous disk crashes. Bob Girardi, a data collection technician, said, ``We had been considering the [backup] option for some time, but that first disk failure was the [convincer].'' The Maynstream 60 backup system enabled him to restore data after the second crash in minutes rather than hours. By Katy Gurley, Special to CW; Gurley is a Wellesley, Mass.-based free-lance writer specializing in high technology. <<<>>> Title : Can plastic cable nets cu Author : CW Staff Source : CW Comm FileName: plastic Date : Jan 30, 1989 Text: PETALUMA, Calif. _ ``I have one word of advice for you: plastics.'' Netronix must have been listening when Benjamin Braddock received this unexpected bit of counsel in The Graduate. That was almost 20 years ago, but to hear Netronix tell it, that tip holds particular importance for would-be fiber-optic cable users. The network supplier last week introduced Fiberstar, which it claims is the first network to support both plastic- and glass-fiber cable. Plastic fiber, according to Netronix, will benefit users in the form of easier installation and troubleshooting, greater flexibility and lower cost.
But several industry observers questioned the need for plastic fiber and its touted benefits. Regardless of which side of the debate you are on, the issue is already moot, they claimed, given that industry leaders such as IBM, AT&T and Digital Equipment Corp. have already standardized on glass fiber, which is required under the 100M bit/sec. Fiber Distributed Data Interface standard (see story page 55). Netronix appears to be ducking that salvo by targeting direct connections to the desk top. Most glass-fiber networks function as campus or corporate backbones, with some installations linking departments between floors. 10M bit/sec. Ethernet hot Taking the opposite tack, Netronix President Art Jopling suggested that plastic fiber has the potential to supplant copper twisted-pair cable in many applications. This will not be easy: Even though it is true that twisted-pair radiates electronic signals, demand for 10M bit/sec. Ethernet over unshielded, twisted-pair cable is hot. Undaunted, Netronix ticked off the following attributes of its 2M bit/sec. network: cheaper connectors; improved signal encoding; easier-to-see ``visible'' light rather than glass fiber's infrared light; low maintenance costs; extensive diagnostic and test features; an IBM Netbios emulator and Novell, Inc. Netware driver; and the ability to more tightly loop the cable in wiring closets. Fiberstar has two major components: PC Optical Fiber LAN Adapters and Optical Fiber Hubs. The adapters transform electrical signals into light pulses and back. The hub regenerates the optical signal, increasing the distances that can be traversed. The network supports 16 ports and reportedly interoperates with Ethernet, Starlan and broadband networks utilizing Netronix bridges. A Transmission Control Protocol/Internet Protocol package is optional. Available now, Fiberstar adapters start at $595; the 16-port hub costs $2,195.
Netronix claims to have eliminated one drawback of plastic fiber _ limited ability to carry data long distances. Fiberstar will support spans of 500 feet between any two nodes. For longer distances, it accommodates glass fiber, which supports distances of up to 6,000 feet. Glass- and plastic-fiber technology share some attributes, such as immunity to electrical noise and eavesdropping, as well as flexibility. Both also have a tendency to degrade over time. By Patricia Keefe, CW staff <<<>>> Title : Bridges, gateways open wi Author : CW Staff Source : CW Comm FileName: lanbridg Date : Jan 30, 1989 Text: WASHINGTON, D.C. _ Local-area network bridges and gateways, hailed by industry observers as the next phase for established LAN installations, are expected to make a big splash at the upcoming Communications Network '89 event. LAN bridge shipments totaled $83 million to $100 million in 1987 in the U.S. and should have an annual growth rate in excess of 20% during the next few years, said Bill Redman, service director of local-area communications at the Gartner Group, Inc. in Stamford, Conn. The following bridge and gateway introductions are expected at Comnet '89, held here the week of February 6: Artel Communications Corp. in Hudson, Mass., will announce Manbridge, a 45M bit/sec. version of its Fiberway 802.3 bridge, to provide a high-speed link between Ethernet LANs. Scheduled to be available in early February, the bridge is said to provide two different types of LAN-to-LAN connection. First, it can connect multiple, geographically distributed Ethernet LANs over standard DS3 connections, as provided by AT&T and other carriers. Second, it can connect multiple 100M bit/sec. Fiberway LANs within a campus area over a 45M bit/sec. token-ring backbone running on coaxial cable. The latter connection can span up to 450 feet without repeaters, Artel said.
Proprietary, high-speed LAN backbones such as Artel's are more suited to handling communications among multiple LANs than products based on current networking standards, Redman said. However, the real breakthrough for the market will be high-speed LAN-to-LAN connections based on the Fiber Distributed Data Interface standard, he added. Artel is also expected to announce an enhancement that is said to allow its Fiberway family to ``carry multiple T1 signals concurrently with Ethernet traffic,'' the company said. As a result, Fiberway can be used as a metropolitan-area network supporting both voice/data T1 links and LAN interconnections, Artel said. The product is scheduled for release this spring. Advanced Computer Communications Corp. (ACC) in Santa Barbara, Calif., is expected to introduce ACS 4100, a bridge said to connect two or more Ethernet LANs over a long-distance link. The 4100 reportedly can perform either as a protocol-independent bridge or as a router that provides more sophisticated connections between devices that use the same networking protocols. Computer Network Technology Corp. in New Hope, Minn., will be announcing expanded support for its Lanlord 8000 Series of Inter Processor Networking Gateway products, which are said to provide channel-based connections between mainframes and various devices on Ethernet LANs via Transmission Control Protocol/Internet Protocol. The new Lanlord Model 8100 is said to support IBM MVS and VM hosts, while the Model 8200 is said to support Digital Equipment Corp. Unibus or BI bus systems. Crosscomm Corp. in Marlboro, Mass., is expected to announce three token-ring LAN bridges. The first will connect multiple LANs, the second will connect a LAN to a 1.5M bit/sec. T1 long-distance link and the third will connect a LAN to a 56K bit/sec. long-distance connection. 
By Elisabeth Horwitt, CW staff <<<>>> Title : Infotron adds packets, jo Author : CW Staff Source : CW Comm FileName: infotron Date : Jan 30, 1989 Text: CHERRY HILL, N.J. _ Infotron Systems Corp. recently joined the bandwagon of T1 multiplexer vendors that are integrating packet-switching technology into their equipment. The ability to send packet-switched data over T1 circuits is a high priority for T1 vendors and their users, according to Frank Dzubeck, president of Communications Network Architects, Inc., a Washington, D.C., consulting firm. ``Packetizing makes more efficient use of T1 channels, allowing you to use less bandwidth for data and provide more bandwidth for voice, which is circuit-switched,'' he said. Two T1 market leaders, Unisys Corp. subsidiary Timeplex, Inc. and Network Equipment Technologies, Inc. (NET), are already providing such capabilities by integrating their switches with packet-switching equipment from their respective subsidiaries. Last January, NET and its subsidiary Comdesign, Inc. jointly announced the SPX Network Processor, which uses packet technology to handle multiple 9.6K bit/sec. transmission rates over the same 9.6K bit/sec. line, according to Comdesign Product Manager David Hofstatter. Linked to NET's IDNX T1 switch, the multiplexer significantly boosts the utilization of each slice of the T1 circuit-switched path, he added. NET also provides a product to manage both SPX and IDNX devices. Timeplex subsidiary Cygnus Computer Corp., acquired 2 years ago, provides ``the basis for our packet-switching line, Timepac,'' said Timeplex spokesman Gregory Langford. Cygnus packet switches and packet assembler-disassemblers can now send data over a 64K bit/sec. channel handled by Timeplex's Link T1 switch family, he added. Further integration of Timeplex's circuit- and packet-switching technologies is in the works, the firm said in a recent statement of direction. 
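The arithmetic behind claims like Comdesign's, several nominal 9.6K bit/sec. streams sharing one 9.6K bit/sec. line, rests on statistical multiplexing: interactive traffic is bursty, so each stream's average load sits far below its peak rate. The sketch below illustrates that reasoning with a hypothetical utilization figure; it is not a description of the SPX itself.

```python
# Why a packet (statistical) multiplexer can carry several nominal
# 9.6K bit/sec. streams on one 9.6K bit/sec. line: sizing by average
# load rather than peak rate. The 10% utilization is an assumption.

def streams_supported(line_bps, stream_peak_bps, avg_utilization):
    """Number of bursty streams a line can carry, sized by average load."""
    avg_per_stream = stream_peak_bps * avg_utilization
    return int(line_bps // avg_per_stream)

LINE = 9600
# Suppose each interactive terminal transmits 10% of the time on average.
print(streams_supported(LINE, 9600, 0.10))   # -> 10
```

The same logic explains the T1 vendors' interest: packetizing data frees circuit-switched bandwidth for voice, since data channels no longer need to reserve their full peak rate.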
Users will be able to manage both types of products with the next version of Timeplex's network management system, Timeview, Langford said. On the other hand, another T1-switch market leader, Digital Communications Associates, Inc. (DCA), has no current plans to integrate packet and T1 switches, even though the company offers both types of products, a DCA spokeswoman said. Infotron, based here, announced that it has entered a joint development agreement with Netrix Corp. to integrate Netrix's packet-switching technology with Infotron's T1 multiplexers. Infotron will initially provide its own version of Netrix's CCITT standard-based X.25 products as part of its product line in March. By early next year, Infotron multiplexers should be able to carry data from Netrix packet switches and packet assembler-disassemblers, according to Infotron Vice-President of Engineering Stig Pierson. Co-developed Integrated Services Digital Network products should also appear around that time, he added. In addition, the two firms plan to integrate their network management systems, according to Pierson. Netrix, a Herndon, Va.-based vendor that sells primarily to systems integrators, already offers a product that allows packet-switched and circuit-switched transmissions to be multiplexed over a 64K bit/sec. line, a company spokesman said. While the company does not yet provide support of 1.5M bit/sec. T1 rates on its own, the spokesman hinted that an announcement is forthcoming _ probably at the Communication Networks '89 conference in early February, one industry source predicted. By Elisabeth Horwitt, CW staff <<<>>> Title : Crosscomm Corp. has annou Author : CW Staff Source : CW Comm FileName: netcross Date : Jan 30, 1989 Text: Crosscomm Corp. has announced a fiber-optic adapter card designed to add fiber-optic network capability to any IBM Personal Computer XT, AT or compatible system. 
The FA1 provides both single- and dual-fiber capability that supports 62.5- to 100-micron cable as well as a connector for thin Ethernet. The half-length PC bus card is said to be transparent to existing network software. The FA1 costs $995. Crosscomm, P.O. Box 699, Marlboro, Mass. 01752. 508-481-4060. <<<>>> Title : Cabletron Systems, Inc. h Author : CW Staff Source : CW Comm FileName: netcable Date : Jan 30, 1989 Text: Cabletron Systems, Inc. has announced a multiport twisted-pair repeater designed to increase flexibility when connecting twisted-pair and coaxial cable Ethernet local-area network segments. The Model MR9000TPT is intended for small work groups running Ethernet over twisted-pair that require connection to existing coaxial or fiber LANs and backbone networks, the vendor said. Up to eight twisted-pair segments can be connected via RJ-45 ports, and the product reportedly includes full IEEE 802.3 repeater functions. The MR9000TPT has a price tag of $2,895. Cabletron, P.O. Box 6257, Rochester, N.H. 03867. 603-332-9400. <<<>>> Title : Torus Systems, Inc. has a Author : CW Staff Source : CW Comm FileName: nettorus Date : Jan 30, 1989 Text: Torus Systems, Inc. has announced a Netware Integration Option for its Tapestry II LAN Manager product line. According to the vendor, the option will permit an unprecedented degree of integration between Novell, Inc. Netware servers and Microsoft Corp. OS/2 LAN Manager servers on the same network. The Netware Integration Option is offered as an add-on to Tapestry II LAN Manager and is priced at $295. Torus, 240 B Twin Dolphin Drive, Redwood City, Calif. 94065. 415-594-9336. <<<>>> Title : Tymnet gets Unix manageme Author : CW Staff Source : CW Comm FileName: netmcdon Date : Jan 30, 1989 Text: McDonnell Douglas Network Systems Co. in San Jose has unveiled an end-to-end Unix-based network management system for users of the company's Tymnet networks.
According to the vendor, the system was designed to provide customers with a framework for future integration needs. A recent report by Framingham, Mass.-based International Data Corp. noted that 50% of network users have as many as three different network management systems installed, while another 32% use as many as six _ each designed to support only one vendor's line of equipment. The McDonnell Douglas product will operate in a variety of vendor environments, the company claims, and will accommodate the integration of other network management systems such as Netview, IBM's host-based offering. The product is built around a Sun Microsystems, Inc. server and access workstations and was designed to optimize resource usage while minimizing overhead. The system reportedly handles network access and routing; data collection; monitoring and control; performance and utilization; and configuration management. It also provides security and automated trouble ticket reporting. The system is scheduled for delivery late this year, and pricing will be between 10% and 20% of overall network costs, depending on the client's individual requirements and the existing hardware configuration. McDonnell Douglas Network Systems, 2560 N. First St., San Jose, Calif. 95131. 408-922-0250. <<<>>> Title : A software utility said t Author : CW Staff Source : CW Comm FileName: netnbs Date : Jan 30, 1989 Text: A software utility said to be capable of defeating security on all versions of Novell, Inc.'s Advanced Netware has been released by Network Business Systems. The Netcrack program requires access to the network file server and a backup copy of NET$OS.EXE. After using the product, the network will have only default users and default security but no passwords. Netcrack costs $99. Network Business Systems, Suite 15601, 1300 Woodhollow Road, Houston, Texas 77057. 713-781-9268. 
<<<>>> Title : Software that connects in Author : CW Staff Source : CW Comm FileName: netntxco Date : Jan 30, 1989 Text: Software that connects independent IBM Systems Network Architecture networks using the IBM 3737 Remote Channel-to-Channel Unit has been announced by NTX Communications Corp. Cross Network Facility/Channel-to-Channel is said to be a VTAM application running under IBM MVS/XA or MVS/370. It was designed to provide a transparent interface between several VTAM applications, including MVS/Bulk Data Transfer, JES2/NJE, CICS and IMS. Scheduled for delivery this quarter, the product is priced at $36,000. NTX, 508 Tasman Drive, Sunnyvale, Calif. 94089. 408-747-1444. <<<>>> Title : Simware, Inc. has release Author : CW Staff Source : CW Comm FileName: netsimwa Date : Jan 30, 1989 Text: Simware, Inc. has released Version 2.0 of Mac3270, the company's connectivity product for Apple Computer, Inc. Macintoshes and IBM mainframes. Version 2.0 reportedly supports all popular 3270 emulation methods and provides error-free two-way Macintosh-to-mainframe file transfer capability across several communications paths. An asynchronous version is available for $250; the Master version of Mac3270 Version 2.0 is priced at $325. Simware, 20 Colonnade Road, Ottawa, Ont., Canada K2E 7M6. 613-727-1779. <<<>>> Title : Forest Computer has unvei Author : CW Staff Source : CW Comm FileName: netfores Date : Jan 30, 1989 Text: Forest Computer has unveiled the AS/400 Adapter, which is designed to enable Digital Equipment Corp. Decnet-based terminals to access IBM Application System/400s as full-screen IBM 3270 devices, and AS/400-based terminals to access Decnet as full-screen DEC VT220 units. Available now, the processor and AS/400 adapter cost from $35,000. Forest Computer, 1749 Hamilton Road, Okemos, Mich. 517-349-4700. <<<>>> Title : Wang Laboratories, Inc. h Author : CW Staff Source : CW Comm FileName: netwangl Date : Jan 30, 1989 Text: Wang Laboratories, Inc.
has introduced Wang Office/Voice Mail, a voice messaging system that runs on Wang's VS computers. The product was designed to be integrated with VS Office Electronic Mail to form a single communications medium for sharing data, text, image and voice, the vendor said. The system consists of three components: Automated Attendant, Message Center and Voice Mail. Wang Office/Voice Mail is priced at $2,800 for a license. VS Office software is licensed separately and is priced from $5,000 to $18,000. Wang, One Industrial Ave., Lowell, Mass. 01851. 508-459-5000. <<<>>> Title : Data centers: Dropping wa Author : Patricia Cinelli Source : CW Comm FileName: dclead Date : Jan 30, 1989 Text: Data centers aren't what they used to be. Major alterations are taking place in these units as they begin to adapt to new realities and new expectations. At the beginning of this decade, the idea of isolated preserves of computing power made sense, says Vin Tomasulo, vice-president of the Chase Financial Services Data Center at Chase Manhattan Bank N.A. But since then, a lot has changed. At one time, hardware was unstable and had to be protected, Tomasulo explains. Now it is robust and powerful and designed for exploitation. Previously, the goal was to develop systems. Now and in the future, the emphasis will be to develop systems rapidly. Work used to be done in batch mode. Now, the predominant orientation is on-line. Fragmented telecommunications has started to coalesce, creating the outline of a unified information environment that must not only be managed but also leveraged for organizational benefit. Revisions ahead The net effect of all these changes, Tomasulo and other information systems executives agree, has been a rewriting of both job descriptions and data center procedures. In the past, the data center operated at the survival level, with all things being done to maintain its mere existence, Tomasulo states. 
These days, data centers are geared toward servicing and supporting the user or the business customer. ``You have a new set of objectives when you don't have to worry about computers breaking down,'' he says. ``Now, the focus of the data center is delivering information to customers _ what they want and when they want it, because today's businesses look to their data centers as the source of a competitive edge,'' Tomasulo adds. That change in expectations, says Ron Brzezinski, vice-president of information systems at Quaker Oats Co. in Chicago, has transformed what used to be a backroom operation into ``the hub of the organization.'' The proliferation of technology outside the computer room at Quaker Oats has not diminished the work of the computer center, Brzezinski says. On the contrary, it has added new layers of responsibility. ``We have a fiber-optic backbone network strung throughout our building,'' he says. ``We have LANs and numerous interconnected desktop computers. This web of technology has to be managed just like the mainframe.'' Furthermore, management of the expanding information systems web is not simply a matter of mechanics. Data center personnel are not just being asked to understand, connect, maintain, create and secure new kinds of systems; they must also adjust to entirely new quantities and types of demands. Once restricted and purely technical areas, data centers are now open to business units in multiple ways. ``The same manager who used to control the `off-limits' data center now manages the company's technological network,'' Brzezinski says. ``He now has continual contact with business needs and clients' applications in his day-to-day operations.'' Tough questions And, he adds, because technology has become a familiar and integral aid to decision making, the requests and questions received are not as easy to solve as they once were. 
Once, a typical user question fielded by the Quaker Oats data center's Help desk would have been something like ``Can my report be out on time?'' These days, staff are hearing questions such as, ``I can't access my information. What's wrong?'' Brzezinski says the Help desk has effectively become ``the first line of problem resolution for the entire company.'' This heightened role is reflected in both a name change _ Quaker Oats has rechristened its Help desk ``the command center'' _ and in staffing levels. ``Two years ago,'' Brzezinski says, ``we had one person on the Help desk. Today, we have six.'' Not only on Help desks, but throughout the entire data center structure, a reorientation is taking place, from technical administration to customer service. At Chase Manhattan, for example, Tomasulo says that the articulated goals for the organization's 110 data centers worldwide are dissemination of information to the business departments and close collaboration with business department users, or ``customers.'' There are a number of ways that those goals translate into actions. Applications are being designed to meet customer requirements, Tomasulo says, but that is only the beginning. Job functions within the data centers are also being altered to enhance user satisfaction. One instance of this, he says, is a new functional concentration on incident management, which goes beyond simply identifying and treating system problems to locate and rectify their root causes. Effecting this kind of change in the orientation of a data center is not easy. ``It takes a while,'' Tomasulo observes, ``to master the dynamics.'' And trying to do so without finding some way to lessen the already existing technical load is about as tricky as trying to execute a U-turn in a speeding vehicle.
Even without the added responsibilities of network maintenance and substantive user support, operations staff have their hands full handling increased message volume and keeping up system service levels in an on-line environment. Selling like hotcakes That is why software products designed to automate a variety of chores performed within the data center have recently become a high priority on many managers' shopping lists. Tape library systems, job scheduling systems and automated documentation systems, which have been available for a number of years, are selling better than ever. And newer categories of automated operations products, such as performance monitors, problem and change management systems, chargeback systems, report distribution systems, automated balancing systems and console management systems, are finding an eager market. ``Recently we've seen a big push toward automated console management systems, which alleviate the need for operator or manual intervention,'' says Neal Ater, vice-president of research and development at Goal Systems International, Inc., which makes software for the IBM mainframe. The most basic level of console automation, according to Ater, is message-reduction software that ``sits between the console and the message and throws away messages you don't need to see.'' The next phase is a software package developed by the company that can automate standard responses or repetitive actions. Just the first level represents a major assist for operators, according to Tomasulo. At one time, an operator's main task was to defend himself against the onslaught of calls, he says. Using products that screen spurious messages allows him to concentrate on more substantial chores.
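The two levels of console automation Ater describes, discarding routine messages and auto-answering repetitive ones, can be sketched as a simple filter that sits between the system and the operator. The message IDs and rule tables below are hypothetical examples, not any product's actual rule set.

```python
# Minimal sketch of console message reduction plus automated replies:
# routine messages are discarded, known prompts get a canned response,
# and only what remains reaches the human operator.

SUPPRESS = {"IEF403I", "IEF404I"}      # e.g. routine job start/end chatter
AUTO_REPLY = {"IEF238D": "CANCEL"}     # e.g. a standard canned response

def handle(msg_id, text, reply_fn, operator_fn):
    """Route one console message through the reduction rules."""
    if msg_id in SUPPRESS:
        return                          # thrown away; operator never sees it
    if msg_id in AUTO_REPLY:
        reply_fn(AUTO_REPLY[msg_id])    # repetitive action, answered for him
        return
    operator_fn(f"{msg_id} {text}")     # only this reaches the console

shown, replies = [], []
handle("IEF403I", "JOB STARTED", replies.append, shown.append)
handle("IEF238D", "REPLY DEVICE", replies.append, shown.append)
handle("IOS071I", "START PENDING", replies.append, shown.append)
print(shown)    # only the unmatched message survives
print(replies)  # the canned answer issued on the operator's behalf
```

Of three incoming messages, only one reaches the operator, which is the ``major assist'' Tomasulo describes: the screen shows exceptions rather than chatter.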
``The environment became so complex that we had to have programs that could identify and resolve problems while we were driving down the highway at 60 mph,'' adds Gary Kirkham, a consultant with Forecasting Planning Associates, headquartered in Dallas. With partial automation, operations staff are freed from some of the routine chores such as scheduling, tape management or console monitoring and given increased latitude for new responsibilities. Guardian Life Insurance Co. of America in New York can attest to the efficiencies that a change in emphasis produces. Thanks in large part to its adoption of several automation products, the insurance company found it could merge its New York data center with its Bethlehem, Pa., operation and also absorb additional work without hiring additional people. The merger occurred just before Thanksgiving last year. ``Except for a skeleton crew, the New York staff was redistributed and assigned other functions in data processing,'' says Alex Polohovich, systems programmer. Although Polohovich did not formally plan for the consolidation, he says that the transition was a smooth one, partially because Guardian had begun automating functions within the data center several years ago. The organization had been gearing up for such a change, he explains, by using automated products such as Computer Associates International, Inc.'s CA7, an automated scheduling package; CA's CA1, an automated tape management package; and IBM cartridge loaders for tapes. The most recent addition to the mix was Candle Corp.'s AF/Operator, a package that catches all commands issued to the system, takes action, tries to correct errors and contacts operators immediately to prevent backlogs. Using AF/Operator saved the Guardian data center from hiring about seven extra people, he claims. The real motives Polohovich stresses, however, that the merging of data centers was not motivated only by budgetary considerations. 
Cost cutting was definitely a factor, he says, but Guardian was also trying to improve the speed and the quality of service it provided its users. Data center managers must now communicate to their staffs the message that automation and functional changes in the data center are both necessary and well-intentioned. In most cases, however, total automation and wholesale layoffs are not the real purpose. Brzezinski, for example, is careful to point out that the moves taken toward data center automation at Quaker Oats are not necessarily a prelude to lights-out status. The objective, he says, is to expand management responsibilities throughout the organization and to integrate the technology. Even so, he admits this may not be an easy adaptation for either data center staffs or managers. ``It's tough because you are taking people who have been isolated [in the organization] and making them a part of the business,'' Brzezinski says. But a convincing case can be made that revamping data centers and the automation of some manual functions creates a more rewarding job path for most data center personnel. Forecasting Planning's Kirkham, for example, predicts that some operators will move into systems programmer positions. ``With the boom in micros,'' he adds, ``those not attuned to programming can move into micro maintenance and wire management.'' Time on their side At the Washington, D.C., data center of the American Association of Retired Persons, Center Manager Ed Hopkins says that automation products have definitely changed the jobs of his operators for the better. ``Operators have more time to do things like monitor system performance, do capacity planning and work with users, a task that has gradually moved out of the director's realm and into the realm of the operators,'' he notes. 
And, at Perpetual Savings Bank, FSB, in Alexandria, Va., Ross Markley, data center manager, says CA's CA-Scheduler package, which took over all the job scheduling on the bank's IBM 3090, served as a catalyst for job advancement. ``Now a scheduler is sent to school,'' he says, ``and we have changed from basically a clerical staff to a more technical one.'' One of the benefits of automation is that it allows the bank to grow without adding to the data center staff, Markley says. Since he joined Perpetual about two years ago at a point when the bank was just beginning to install automated products, Markley says the data center work load has increased about 150% to 200%, but the staff has not increased at all. Now, he says, ``A bigger portion of our budget is going to automated products, which is more cost-effective than paying for additional staff.'' The real savings, however, are in not hiring new workers, rather than eliminating existing personnel, he says. There is also a portion of the budget set aside for retraining and continuing education of data center staff. Markley says he fully expects that the data centers he and other managers will be overseeing in a few years will be at least qualitatively different from what has existed. Adapting to survive In order for data centers to survive, he says, they will have to be automated and well controlled in functions such as scheduling, report distribution, performance and capacity planning. They will also have to process information quickly and accurately. ``In the past, we could see the need for an upgrading of a CPU coming years in advance,'' Markley reports. ``Now and in the future, we will have to watch month by month and be able to react very quickly'' to the situation. Furthermore, Markley adds, ``People in the network side of data centers need to become more refined in their public relations ability and in their knowledge of how PCs interact with the mainframe. 
The big focus at Perpetual and eventually everywhere will be on security of data.'' Data center managers are also due for a change, according to Quaker Oats' Brzezinski. Their role, he says, is changing to something that might be more accurately described as that of a technology center manager. And the requirements for that type of position are quite different. It is a job that will require both strong technology awareness and a business applications perspective. ``Before, we were satisfied with someone who could enforce schedules and manage vendors, but in the future, that will not be good enough,'' he says. By Patricia Cinelli; Cinelli is a free-lance writer based in Washington, D.C. <<<>>> Title : A case of shifting priori Author : CW Staff Source : CW Comm FileName: dcbox1 Date : Jan 30, 1989 Text: What follows is a picture of how an insurance company has already altered its MIS staff distribution and how it sees that distribution shaping up 10 years from now. The company, studied by Forecasting Planning Associates, is described as a mature firm with an effective management control system, strong internal controls and moderate growth. In 1978, most of the staff allocations were applied to applications development, with computer operations taking second place and data entry third. Only a negligible 3% of the staff was allocated to support end-user computing that year. By 1988, user-related activities had risen substantially on the staffing allocation agenda, while the percentage of the computer operations staffing was only half of the 1978 allocation level and data entry staffing was down to only 1%. The company is forecasting even greater changes for 1998, ones that will further deemphasize staffing for traditional data center functions for the following reasons: Intel Corp. 80486/586 chip technology will dominate the market. Computer-aided software engineering will mature. Commercial software offerings will continue to improve. 
Automated operations will be tightly integrated with the mainframe operating system. Bar coding and optical scanning will eliminate data entry. Optical-disk technology will be used for storage. Automated network linkage will be in place. <<<>>> Title : Data center managers tack Author : Joanne Kelleher Source : CW Comm FileName: dcintv Date : Jan 30, 1989 Text: Leonard Eckhaus is president of the Association for Computer Operations Management (AFCOM), a group of operations management professionals primarily working in large-scale data centers. AFCOM's mission is to help its members respond to changes in the data processing environment that introduce new management challenges. Eckhaus spoke recently with Computerworld Senior Editor Joanne Kelleher about how automation is altering data center operations. What do you consider the major management challenges facing operations managers right now? Automation is the single biggest thing coming and will have the most impact of anything going on today. The changes, in terms of operations, are going to be dramatic. End-user computing, where inputting data, printing reports and so on are being passed on to the user, is reducing staff in the operation. Operations managers are going to have to deal with people problems because there will be an elimination of jobs. Some of that is already happening, isn't it? Yes. Most data centers have either totally eliminated the data entry department or else reduced it to just a few people working on miscellaneous things that come in. It is also evident that there is going to be an elimination of most or all of the computer operators in terms of the traditional work they now do such as mounting tapes, pulling forms from the printer or responding to console messages. Do you have any sense of where these people are going, how they are being used? In a lot of cases, the jobs are just disappearing and the people aren't being moved elsewhere. 
There are other areas in operations that some data entry people can go into. They can function, for instance, as data control clerks. But for the most part, the people are just being let go. Data entry departments are on their way out. I take it that you think that there are better alternatives. I think that if automation is going to work, even the people whose jobs are going to be phased out have to somehow be assured that they will be retrained for better jobs in the data center or that they will wind up working in a user department in which they will be viewed as experts. What do you see as some of the new roles for staff? What kinds of jobs will they be doing? What is going to come in is an addition of higher level technical analyst positions in which the job will be to monitor system activity and react to conditions that the automated systems bring to their attention. We'll also see things like performance measurement analysts, systems software analysts and programmer analysts in the operation. Now, these positions obviously already exist in areas outside the operation. But when we get close to unattended operations, these positions will have to be in the operations area. A little earlier, you mentioned the idea of personnel actually going out into the end-user departments. Do you, in fact, expect that there will be staff attached to the data center or to the operations area who will serve as sort of resident consultants in the business departments? I'm sure there will be, although what I was referring to before is that end-user departments are going to want to have some people with DP backgrounds working for them. I think we'll see both kinds of arrangements. It is likely that there will be people in the data center who will spend all their time interfacing with the end-user departments, training them, helping to get service on their equipment and so forth. But I also think that some of the [displaced] people will end up working in the user departments. 
Do you have a sense of the likely time frame for widespread automation? Well, people talk about automation and they talk about unattended or lights-out operations. There is really a big difference. Today, unattended is still a concept. Almost nobody is truly running unattended. Automation, on the other hand, is something that is here. Virtually all large-scale data centers have some degree of automation; they have a tape library system or a scheduling system, or they have two or three or four or five different software systems. Are operations managers taking charge of automation efforts, or are they letting the vendors set the pace and the direction? Up until now, it has been more the vendors taking the lead. But now, operations managers are realizing that this is the wave of the future, and they must take control of the situation and be involved in actually coming up with a plan to get automated. What caused this change of heart? Operations managers are realizing that they need to be automated just to keep up with today's computer equipment. For example, with the internal speed of today's computers, the number of messages that are generated on a console can become impossible for an operator to handle. So console management that will answer standard console messages becomes almost a necessity if you want to keep the system running efficiently. Are there any other areas in which the pressures are becoming more than can be humanly handled? Maybe not more than can be humanly handled, but you have to remember that every time there is a human handling, it takes time. You've got a CPU sitting there and if it waits one second, that second on some CPUs could have been used to perform several million additions. Where things are automated, you gain all of that time back. Is there any top-down pressure being applied by, say, the head of MIS? It may come from the head of MIS but certainly no higher. In most cases, it goes the other way. 
The operations manager who wants to do more automating is having to justify it and go in and say, ``I'm going to be able to save dollars in terms of head count'' or ``I'm going to be able to service the user much better.'' The way that this is often discussed is as returning management to the end user through automation. The question is, if you return management to the end user, exactly what does that leave in terms of data center or operations duties? What we are really talking about is returning some of the control to end users. There are some things they are doing now that used to be done in the computer center, like inputting data, getting output directly back. In some installations, users can, within certain limits, even schedule their own work in terms of when it is going to run. They're not doing the job of running the systems or of maintaining the hardware, but they are going to have more control over their own work. Is increased automation the only major change affecting data center operations? How about downsizing? Over the last few years, we've begun to see downsizing in some medium-size data centers as they go from mainframes to minis. It is still not a major thing, but it is happening in small numbers. When you go from large mainframes to large mini systems, some of the traditional operations stuff is no longer necessary. Maintenance is different. The kinds of operating systems you are using are different. It changes the name of the game for operations. Is there any possibility that, to a certain extent, the willingness to entertain the idea of automating operations results from seeing it as an alternative to wholesale downsizing? I'm not sure. I don't really think so. Downsizing is a term that means different things to different people. You could replace one large mainframe computer system with 10 minicomputer systems. In that case, would you really be downsizing or are you creating more problems and more areas to get involved in? 
How about downsizing in the sense of certain functions moving out into the departments on either departmental minis or local-area networks? What impact is that having? One of the things that happens is that some of the budgets for hardware wind up going into the end-user departments. Also, the end users are getting more sophisticated, so they want more input on what kind of hardware is going to be purchased for their use. So the data center becomes much more of a service area, and the job becomes more one of servicing requests than of dictating what can or can't be done. This may be the way it should always have been, but it is only recently, I think, that people in data processing have been realizing that they really are a service adjunct and not the company's main business. Do you see any of these changes affecting the data center's relation to other parts of the MIS organization? Yes. One way is in terms of status. In the past, operations managers have been much more managers of people and large budgets than they have been managers of technology. It is strange because it is a reversal of the way the rest of the world operates, but, in a technical field, that really becomes a hindrance in terms of professional recognition. In a technical area, when you look at managers in terms of status, the more technical the people in the area, the more recognition [the area] is given. Through automation and the elimination of the lower level jobs and the growth of the higher technical-level jobs, these managers are starting to be looked at as managing technology. So there are some real benefits in automation for the operations manager? Oh, I think there are a lot of benefits. They're not going to be fighting wars all the time trying to explain why things went wrong. There won't be as many day-to-day problems coming up in the operation in terms of production. What should the wise operations manager be doing now to guide the automation process and make sure that it is on track? 
I think the wise operations manager is looking at his data center as it stands and the applications that are being run today as well as where that data center is going _ what is going to be happening to the work load, what new applications are coming up _ and developing a plan to get automation installed. There really has to be a plan for automation because, in many cases, operations managers are going to have to justify the up-front costs that are involved. <<<>>> Title : The writing on the data c Author : CW Staff Source : CW Comm FileName: dcquotes Date : Jan 30, 1989 Text: It is widely acknowledged that data centers are entering a period of considerable change. In the interest of getting a better reading on the force and direction of this phenomenon, Computerworld spoke with a number of information systems managers working at or near the epicenter about changes that have already taken place in their organizations' data centers and what further developments are expected. Here and now: ``In our own corporate center, we're going to unmanned rooms, where your CPUs, your controllers, your jazzy devices are being isolated away from the mainstream of activity.'' Around the corner: ``The next move is PCs totally being capable of interfacing with the host. Some of them are becoming so powerful, you'd think they're little hosts in their own right, right now. ``In some cases, you're probably going to see some shrinkage in staffing of corporate data centers because of the database structures that are becoming more and more prevalent. Data is going to be much more readily accessible from remote areas, even by way of PCs.'' Here and now: ``There will always be large, central data centers. I don't think there's going to be any major revolutions in the next five years. There will be larger and faster mainframes and more sophisticated software. How they're managed will be important and the technical expertise to manage them is going to be critical. 
I think operations will evolve into a more challenging role than it is today.'' Around the corner: ``If the company doesn't require or dictate that you're a profit center, then profit-center-type management practices will need to be in place. Using such techniques keeps you competitive. You're going to carefully select vendors and products to keep expenses down, not unlike if you were offering an insurance product to someone. It's like running a business.'' Here and now: ``One popular buzzword is automated operations or some variation on that. In our data center, I see that as having a very, very gradual occurrence . . . getting ourselves into a position where we can run more jobs unattended as opposed to a lot of operator interface.'' Around the corner: ``I think operations managers are going to have to learn more about voice communications because there seems to be some tendency to combine voice and data communication under one area, and frequently that's going to come under operations.'' <<<>>> Title : Care and feeding of speci Author : Peter Berkeley Source : CW Comm FileName: berk Date : Jan 30, 1989 Text: Like it or not, the role of data centers is changing, from one of manual-intensive operations to one of support and service delivery. Where once sequential tape drives dominated the computer room landscape, there are now rows of mass storage direct-access devices, which make data storage and retrieval much more expedient. As a result, tape operators and tape librarians are, for the most part, on the decline. Data entry personnel, print operators, bursters and de-collators are suffering similar fates, as more data centers transfer responsibility for both input and output back to users. Although the value of such labor-intensive data center functions is declining, data center staff have not reached a dead end. The transformation of organizational computing is opening up new opportunities for retooling the skills of existing staff. 
If they haven't already, data center personnel will soon be asked to perform new kinds of functions demanding new kinds of skills. They will be asked to perform as highly skilled knowledge worker professionals. Users demand more Users are coming to expect and demand higher levels of service, and most of the activities of data center staffs in the future will be geared toward meeting those demands. Typical functions will include monitoring and maintaining service delivery levels; providing appropriate user interfaces; managing physical databases and communications environments; and training users in the use of hardware and automated end-user-computing tools. The way these functions sort themselves out will be roughly as follows: Computer Operations, as we know it today, will become highly specialized, with functions that relate specifically to direct operation of computing resources. Direct Operations will continue to be the core of the data center, with responsibility for hands-on operational activities such as console operations, tape handling, I/O, job scheduling and micrographics. Maintaining the operational capability of Direct Operations will be the job of Operations Support. This area will perform many of the activities that now tend to distract Direct Operations from its main mission _ network control, physical database management, resource accounting and configuration management, user Help Desk services, performance change and problem management. Technical Services will continue to focus on installing and applying fixes but will assume added responsibilities. These will include jobs such as management of installed database and teleprocessing environments, network support and access methods and support for the growing base of end-user software resident in both the data center and Information Centers. 
Management of the data center will also evolve to include a critical planning component, with far-reaching implications for information technology usage across organizations. This function, if it is given the proper support and authority, should be invaluable in ensuring the alignment of computing resources with business goals and directions. Ultimately, the Management and Planning area will be responsible for transforming the data center into a provider of computing power and service to organizations. Finally, there must be an Administrative Services Group to provide the largely generic administrative and clerical functions that will be required _ data center administration, vendor relationships, documentation libraries and so on. Human resources lacking Unfortunately, not many companies currently have adequate formalized human resources management programs for the data center. And, if they are not equipped to meet current needs, they certainly will not be prepared to tackle the much larger planning issue of minimizing the negative impacts of change and transformation on this scale. Information systems managers must take a number of actions to stave off long-term problems in data centers. These include the following tasks: Providing the right environment and tools to foster rapid staff development. Formulating a clear definition of how job functions will change and what skills will be needed to perform the new tasks. Developing an aggressive and focused training program that will make it possible for data center staff to acquire the skills they need to remain viable and contributing members of the organization. Creating a succession management program that will both protect the center against disruption in critical service functions and serve as a baseline for the development of clear career tracks for data center staff. With functions in the computer center becoming highly specialized, it will no longer be possible to shuffle staff in an arbitrary fashion. 
The evolution of computer centers and organizational computing will stall if trained computer personnel are not on board and adequately equipped. By Peter Berkeley; Berkeley is director of education products and services at Operations Technology Corp., an MIS consulting firm in Southboro, Mass. <<<>>> Title : Data centers face relativ Author : Gary Robins Source : CW Comm FileName: robsidtx Date : Jan 30, 1989 Text: Data center budgets will continue to move in an upward direction, but total corporate spending on technology, both inside and outside the MIS organization, will increase at a greater rate. That is the projection of Douglas Brockway, a principal at Nolan, Norton & Co. in Lexington, Mass., who maintains a database of personnel, equipment and software costs in more than 200 large-scale data centers. According to Brockway, there are two reasons for the slowing of data center outlays relative to corporate expenditures on personal, interactive and networked technologies. First, the unit cost of the total amount of work being done by data centers is declining. Brockway, who defines the unit cost of work in terms of units of consumed processor power, or millions of instructions per second, says that unit costs within his sample are dropping at an average of 20% a year; the decrease is estimated at about 10% to 15% a year for larger data centers. A second factor is that even greater economies are being achieved with personnel in data centers. ``The head count required to support a unit of work in the data center is going down even faster than unit costs,'' Brockway observes. A smaller increase is a far cry from an actual drop, however. Even though the boom may have passed, in 98% of the data centers that Brockway follows, ``budgets are still rising in raw dollars'' and he does not expect the figures to suddenly turn negative. One stabilizing factor is that the whirlwind activity going on outside the data center actually creates additional work inside that area. 
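A 20%-a-year decline in unit costs compounds quickly. The following sketch is illustrative arithmetic only; the starting cost of 100 and the five-year horizon are hypothetical figures chosen for the example, not data from Brockway's sample.

```python
# Illustrative only: project a constant annual percentage decline in
# data center unit costs, as in the 20%-a-year average cited above.

def project_unit_cost(start_cost, annual_drop, years):
    """Unit cost after `years` of a constant annual percentage drop."""
    return start_cost * (1 - annual_drop) ** years

for year in range(1, 6):
    print(f"Year {year}: {project_unit_cost(100.0, 0.20, year):.1f}")
# Five straight years of 20% declines leave the unit cost at about a
# third of its starting level (100 * 0.8**5 = 32.768).
```

At the 10% to 15% annual rates cited for larger data centers, the same arithmetic yields a slower but still substantial decline.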
Users must be connected to mainframes; files must be accessed, supplied and maintained. ``I think of the data center as providing a product to a marketplace,'' Brockway says, ``and the marketplace demand keeps going up.'' Local alternatives Why bring new work to the data center when local alternatives exist? According to Brockway, as the unit costs drop, the data center becomes a more attractive alternative for users. Rather than run an application on a departmental system, users find the cost of data center computing increasingly less expensive. There is also a clear trend to use the data center as an ``unlimited'' repository for data. As the market grows, the effect is to accelerate system turnover in data centers as applications start to require significantly more virtual storage, relational capabilities and communications features. Experts say data centers will be spending the most on software and service during the next few years. Perry Harris, director of information systems at The Yankee Group in Boston, for example, expects a dramatic shift from investment in iron to investment in software and support. This will be especially true within the IBM arena, he says, because IBM will be pressing to derive a higher percentage of its income from software revenue, including fees for maintenance, system software support and the actual monthly cost of that system software. IBM is also expected to increase both the number of software offerings and the bundling of those products, Harris says. ``What the end user will see in his budget will be a stabilization or low percentage rate growth of expense for hardware but a much higher rate of growth on the software side,'' he says. Arnold Farber, president of Farber/LaChance, Inc. in Richmond, Va., says he feels that part of the increase in software expenses will be directed at the effort to pare salary costs through automation. 
While data center equipment costs have come down, the cost of salaries has conspicuously increased, Farber observes, and that makes staff reduction through automation a very attractive idea for many organizations. Farber estimates that, so far, only 10% of the total market has embraced unattended operations as a goal, but he predicts that interest in automation will continue to gain momentum during the next five years. One thing that will spur interest, Farber says, is the emerging concept of trying to turn data centers into profit-making operations. By Gary Robins; Robins is a free-lance writer based in Northfield, Minn. <<<>>> Title : Tracking IBM's gateways Author : Zak Kong Source : CW Comm FileName: kong1 Date : Jan 30, 1989 Text: For MIS managers implementing LAN-to-host gateways, predicting which of IBM's major gateway directions will emerge as the industry standard is crucial. Will IBM's future product strategies emphasize one local-area network gateway alternative over another? Or will the industry leader continue to offer the two solutions it does today? The answer lies in IBM's recent announcement and current promise of several new products, which should greatly affect the future direction of the IBM Token-Ring LAN as well as its connectivity to IBM's Systems Network Architecture (SNA) host resources. In a November rollout, IBM announced a high-speed 16M bit/sec. version of the Token-Ring network. And expected soon is a new model of the IBM 3174 cluster controller featuring enhanced Token-Ring gateway capabilities. On the surface, these product enhancements indicate that IBM's preferred Token-Ring gateway solution will center around the 3174, which has emerged as a very strategic product for IBM, especially in the area of Token-Ring gateways. But IBM also offers a capable Token-Ring gateway implementation that uses a PC as the link. And the vast majority of third-party Token-Ring gateways on today's market use a PC, not a 3174, as the gateway. 
Same old story Tracking IBM's product directions has always been difficult. The company never seems to offer just one way to address a particular application; multiple solutions are a way of life at IBM. In fact, market research there sometimes means announcing several products and then seeing which alternative _ or alternatives _ are embraced by the marketplace. Products that do not catch on are simply downplayed or, in some cases, discontinued altogether. Witness the PCjr. Or consider IBM's first LAN for personal computers _ a broadband network that utilized a bus topology. Called the PC Network, it was not a huge success by IBM standards. IBM's first serious LAN offering, most analysts agree, was its Token-Ring network. The Token-Ring incorporates a baseband ring topology and a token-passing access method. At the time of its announcement, Token-Ring was a radical departure from the then-reigning de facto standard, Ethernet, which utilizes a bus topology and the carrier-sense multiple access with collision detection (CSMA/CD) access method. Today, however, the Token-Ring has emerged as the LAN of choice for PC-oriented LAN implementations. According to La Jolla, Calif., market research firm Computer Intelligence, Token-Ring has a larger installed base than Ethernet in those installations interconnecting PCs only. In this segment, Token-Ring has a 38% share while Ethernet has a 35% share. Clearly, then, determining the gateway choice for a Token-Ring network is extremely important. A LAN gateway consists of a hardware/software module attached to the LAN that provides a shared communications path to the host mainframe. All nodes on the network can use the gateway to communicate with the host. This provides a cost-effective solution, because each node does not have to have its own separate communications link. 
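The shared-link economics of a LAN gateway can be sketched in a toy model. This is hypothetical illustration code, not IBM software, and all names in it are invented: many workstation nodes funnel their host traffic through one gateway object, so only the gateway needs an upstream connection.

```python
# Toy model of a LAN-to-host gateway: one shared upstream link is
# multiplexed among all workstation nodes on the LAN. Illustrative
# only; class and method names are invented for this sketch.

class Gateway:
    """Holds the single shared host link for the whole LAN."""
    def __init__(self):
        self.host_log = []          # stands in for the one upstream link

    def send_to_host(self, node_id, message):
        # Tag each frame with its origin so replies can be routed back.
        self.host_log.append((node_id, message))

class Workstation:
    """A LAN node with no host link of its own; it uses the gateway."""
    def __init__(self, node_id, gateway):
        self.node_id = node_id
        self.gateway = gateway

    def query_host(self, message):
        self.gateway.send_to_host(self.node_id, message)

gw = Gateway()
nodes = [Workstation(f"ws{i}", gw) for i in range(3)]
for ws in nodes:
    ws.query_host("LOGON")

# One shared link carried all three sessions:
print(len(gw.host_log))   # -> 3
```

The point of the sketch is the cost argument made above: three nodes reached the host, but only one upstream connection existed.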
Of IBM's two current solutions for Token-Ring host gateway connections, the first approach uses a designated PC on the Token-Ring, which functions as the gateway. This ``gateway PC'' communicates with the other ``workstation PCs'' via IBM's Network Basic I/O System (Netbios) interface. The second approach, in addition to using the 3174 controller, can also use a number of other hardware products _ for example, a 3725 or 3745 front-end processor or an IBM Application System/400 _ as the physical gateway. In these scenarios, the IEEE 802.2 protocol is used in place of Netbios. While quite different in implementation, both of these gateway approaches rely on IBM's PC 3270 Version 3 emulation software to achieve 3270 communications functionality. The software is just configured differently, depending on the gateway. To ensure future compatibility, however, a clear gateway choice must emerge. Before addressing that issue, it is important to take a close look at the standards and technologies involved. The Institute of Electrical and Electronics Engineers, Inc.'s 802 committee has produced a number of standards relating to LANs: 802.1, 802.2, 802.3, 802.4 and 802.5. The 802.1 and 802.2 standards are high-level layers that apply to all 802 implementations. The 802.3 through 802.5 layers are lower level and define the LAN media-access method. The 802.3 standard defines Ethernet (CSMA/CD); 802.4 defines Token-Bus and is used almost exclusively in manufacturing environments as defined by the Manufacturing Application Protocol standard; and 802.5 defines Token-Ring. Another industry standard that is very important to the PC-based LAN marketplace, although not an 802 standard, is Netbios. It was developed as an extension to the IBM PC's BIOS firmware. In a sense, Netbios can be viewed as a high-level alternative to the 802.1 layer. Netbios has emerged as a de facto standard for PC-only LANs. It is supported by many third-party LAN vendors on LANs other than IBM's Token-Ring. 
It has allowed a wide variety of LAN-based applications such as host communications gateways and database systems to emerge. And although IBM's OS/2 Extended Edition and LU6.2 will probably emerge as the new standards in this area, Netbios-based applications will probably continue to play a major role. In fact, the OS/2 LAN Manager from Microsoft Corp. will be fully compatible with Netbios. The gateway alternatives In the first gateway approach IBM offers, a designated PC on the Token-Ring acts as the gateway. It is equipped with a Token-Ring adapter board on the downstream link and a separate communications board for the upstream host connection. This methodology is shown in the top half of the chart above. The Netbios interface is used for communications between the gateway PC and the workstation PCs. With such a gateway PC approach, only the gateway PC itself is recognized as an SNA Physical Unit (PU). Each of the workstation PCs on the Token-Ring is designated as a Logical Unit (LU) only, not a PU. Hence, it can be said that SNA stops at the gateway PC. This type of PC gateway is the most prevalent on the market today. In addition to IBM, several third-party vendors have introduced communications gateways of this type. The PC gateway approach is typically used with smaller Token-Ring installations that interconnect PCs exclusively. On the higher end are larger Token-Ring installations that interconnect a wide variety of IBM equipment _ for example, PCs, minicomputers and controllers _ encompassing hundreds or thousands of nodes. In this case, IBM's second approach often comes into play. This technique literally extends SNA onto the Token-Ring down to the individual LAN node level. Here each communications workstation is designated as a separate PU. The second approach does not use a PC as the physical gateway. 
Rather, it involves directly attaching the Token-Ring to one of the following IBM hardware products: a 3174 cluster controller, a 3725 or 3745 front-end processor or an AS/400 or 9370 mid-size computer. IBM's Token-Ring Interface Card (TIC) is used to achieve the physical connection to the Token-Ring. The lower half of the chart above illustrates the second type of gateway, in which an IBM 3174 functions as the physical gateway link to the host. To propagate SNA all the way across this Token-Ring, a much finer level of control is required than can be exercised via Netbios. Therefore, the lower level 802.2 protocol is used. Netbios is a higher level specification than 802.2 and as such is easier to program to, providing broad, network-wide services such as name management. As a result, it has emerged as the implementation of choice for most third-party gateway suppliers. IEEE 802.2, on the other hand, is lower level, which means it is more complicated to work with but provides a much more detailed level of control. But the two standards are not mutually exclusive. Simultaneous 802.2 and Netbios communications can occur on the same Token-Ring, although each protocol would still require its own gateway. Having clarified IBM's two LAN gateway approaches, it is also important to understand how three key IBM concepts _ the 3174 controller, the Systems Application Architecture (SAA) and Netview _ fit into the picture. The 3174 controller The 3174 cluster controller has emerged as a strategic product for IBM, not only for 3270 coaxial applications, but also for the Token-Ring gateway. Of all the hardware that can be used for an 802.2 Token-Ring gateway, the 3174 is the least expensive and therefore will perhaps emerge as the most used product for 802.2 gateways. From a connectivity standpoint, the 3174 has two sides: the upstream connection and the downstream connection. The upstream connection can be a host channel attachment, a Synchronous Data Link Control host link or another Token-Ring. 
Assuming that the downstream connection is to a Token-Ring, then an upstream connection to a host makes the 3174 act as a LAN-to-host gateway. An upstream connection to another Token-Ring makes the 3174 act as a LAN-to-LAN bridge. The downstream connection can be coaxial or Token-Ring. Coax allows the 3174 to function in its historical 3270 coaxial controller mode. Token-Ring is achieved through the use of the TIC. For downstream connections, the 3174 can retain all its protocol implementation functionality within itself, or else some of the control can be delegated to downstream devices. Delegating protocol implementation in this fashion is made possible by a facility called Distributed Function Terminal (DFT). With DFT, attached PCs can communicate with a host using the 3270 protocol and a variety of other SNA protocols. Once the controller goes on-line with the host, all SNA request units are passed through the passive controller and then processed by the terminal device. The DFT acknowledges that the terminal device has built-in intelligence. DFT allows PCs to communicate via advanced protocols such as LU6.2/Advanced Program-to-Program Communications (APPC) by using newly available software. It allows multiple concurrent host windows. It also opens up the controller link to accommodate other future directions, because DFT assigns the workstation complete control over what data-stream control characters it can handle. Note that the facility allows the workstation, not the 3174, to determine the LU type. This means that workstations can be attached to communicate in ways never originally designed into the older 3274 controller, the 3174's predecessor. The latest 3174 speculation is that the new model will be DFT-only. SAA In creating SNA, IBM established a common blueprint for host communications, defining how different types of users and computing equipment would communicate. In 1986, IBM set an even bigger goal. 
Faced with competitors like Digital Equipment Corp. that had managed to keep their computer lines relatively applications compatible, IBM began publishing its SAA, a set of generic specifications designed to define not only computer communications but all computer applications. At the core of SAA is the premise that software should be portable from one hardware environment to another without modification. Ultimately _ although it is far from implementation _ a program for an IBM 3080- or 3090-class mainframe should run on a Personal System/2 micro and vice versa. SAA's specifications also define a new level of advanced dispersed data processing, namely, cooperative processing. Based on the much publicized LU6.2/APPC, cooperative processing distributes processing loads among different-sized computers within the network, while allowing the systems to talk to each other directly at the operating system level without the need for terminal emulation. In terms of PC-to-host links, LU6.2/APPC includes the ability to design a very high level of integration between the PC and the host. A PC program can exchange information with the host by updating or extracting only those elements that are needed, thereby not necessarily requiring complete file transfers. This integration will greatly reduce line costs in many LAN-to-host applications. Netview Centralized network management has become an increasingly important concern among those responsible for the management and control of local- and wide-area networks. And while network management tools may not be critical considerations for those implementing a small, 20-node Token-Ring application, for example, they become imperative for those setups running large 802.2 Token-Ring networks that may span upwards of 4,000 to 5,000 nodes. Netview is IBM's strategic product for network management. Originally announced in May 1986, its first release simply consolidated several existing IBM host-based software tools for network monitoring. 
In September, IBM announced Version 3 of Netview, adding functions and making the software easier to use. Netview has now emerged as a powerful system for centrally controlling and managing multiple or interconnecting SNA networks. DP and communications managers use Netview for a variety of tasks, including tracking and controlling terminal usage, identifying and reporting hardware problems, testing modems and collecting data on specific network resources. Like SNA and Token-Ring, Netview is rapidly becoming a de facto standard within the industry. In a LAN environment, it is interesting to note that the product can only support Systems Services Control Point-to-PU sessions. This means that Netview can keep track of SNA PU devices only, not LUs. Therefore, in a Token-Ring using a PC as a gateway, Netview can only determine the physical location of the gateway PC, which represents the PU. If individual workstation PCs are moved within the Token-Ring, Netview has no way of tracking them because each node PC represents an LU, not a PU. In an 802.2 Token-Ring in which each PC has its own PU address, Netview provides complete control over each workstation PC. If a workstation PC is physically moved, Netview can determine its new location. This is important in large networks in which network management has the responsibility of keeping track of thousands of PCs. IBM's direction IBM derives most of its revenue from mainframe products, not PCs. Therefore, many industry observers agree that IBM's overriding product strategy has always been to overload the mainframe. By so doing, IBM is able to sell highly profitable mainframe enhancements. But how does this strategy translate to the LAN gateway marketplace? A case could easily be argued that IBM will drive the market toward 802.2 Token-Ring gateways using hardware like the 3174. With the centralized Netview control of an 802.2 Token-Ring, growing and expanding network applications is a more manageable undertaking. 
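The PU-versus-LU visibility rule above can be condensed into a short sketch. Netview itself offers no such programming interface; the function and data structures here are invented purely to model the distinction the article describes.

```python
# Netview tracks SNA Physical Units (PUs) only. With a Netbios PC
# gateway, only the gateway itself is a PU, so individual workstations
# are invisible to Netview; with an 802.2 gateway, every workstation
# is its own PU and can be tracked. All names below are illustrative.

def netview_visible_nodes(gateway_type, gateway_node, workstations):
    if gateway_type == "netbios-pc":
        # SNA stops at the gateway PC; workstations are LUs only.
        return [gateway_node]
    if gateway_type == "802.2":
        # SNA extends to each node; every workstation is a PU.
        return [gateway_node] + list(workstations)
    raise ValueError("unknown gateway type: " + gateway_type)

stations = ["ws-1", "ws-2", "ws-3"]
```

Under the Netbios style, only the gateway node appears to network management; under 802.2, every station does.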
And the more applications you add, the more mainframe resources you consume, which implies an IBM strategy of overloading the mainframe and thereby requiring profitable host upgrades and enhancements. IBM clearly understands centralized control. So do corporate MIS managers who have long lived in the world of centralized control and tend to view the world from the host's point of view. PC vendors, on the other hand, tend to view the world from the PC side, in which decentralization seems more desirable. This scenario seems to indicate that while PC vendors attempt to drive the market toward PC-based Netbios gateway solutions, mainframe-oriented IBM would benefit by driving the market toward centrally controlled Token-Ring implementations with 802.2 gateways. Further, the previous version of IBM's Token-Ring ran at 4M bit/sec., while the new version will run at 16M bit/sec. It has been rumored that, if it is indeed upgraded, the new 3174 will support both the 4M bit/sec. and 16M bit/sec. Token-Ring networks and will therefore be able to act as a LAN bridge between the two, in addition to its LAN gateway functionality. The enhanced controller should also include more intelligence and larger memory capacities, allowing it to become a more complete SNA node, with added routing and network control functions. Thus, these product enhancements would seem to provide additional credence to the theory that IBM will be moving the market toward 802.2 Token-Ring gateway implementations involving distributed functionality, 3174 gateways and centralized Netview control. But that scenario may be wrong. Never underestimate the unpredictability of the Armonk giant. IBM may, in fact, push Netbios solutions. While waiting for a further sign, MIS and communications managers will have to base today's implementation decisions on the knowledge currently at hand. 
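The connection-dependent roles the article describes for the 3174 _ host gateway, LAN bridge or traditional coaxial controller, depending on what is attached upstream and downstream _ can be summarized in a small decision function. The connection labels are invented for illustration and are not actual IBM configuration parameters.

```python
def role_3174(upstream, downstream):
    """Map a 3174's connections to its role (labels are illustrative)."""
    if downstream == "coax":
        # Historical 3270 coaxial controller mode.
        return "3270 coaxial controller"
    if downstream == "token-ring":
        if upstream in ("host-channel", "sdlc-host-link"):
            return "LAN-to-host gateway"
        if upstream == "token-ring":
            return "LAN-to-LAN bridge"
    return "unsupported configuration"
```

The same box thus plays three different roles purely as a function of how it is wired into the network.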
By Zak Kong; Kong is president of Network Software Associates, Inc., a micro-to-mainframe communications software producer based in Laguna Hills, Calif. <<<>>> Title : What more can MIS do? Author : James Young Source : CW Comm FileName: young1 Date : Jan 30, 1989 Text: Though we may have become used to them, the boundaries of MIS responsibility are by no means stationary. Traditional roles of analysis and programming are being challenged by non-MIS groups. There are also the well-publicized battlegrounds of end-user computing and departmental computing. These encounters are perceived as being of profound importance and are usually settled based on strategic issues or at least in a considered way. There are, however, other demarcation discussions that get less attention and are not blessed with such thoughtful evaluation. These usually involve companywide ancillary responsibilities for which MIS may provide a suitable home and can include the mail room, company print shop, telephones and office service duties. Pairing such bits and pieces with MIS can prove to be more important to MIS than we might at first think. On the positive side, they can complement an MIS service operation, build economic and management synergies and exhibit MIS skills. On the other hand, they threaten to distract our attention, waste our energy and cast a menial reflection on the department. Therefore, despite the lack of traditional attention toward these areas, MIS should take an interest in the question of whether they belong under MIS. The organizationally overlooked This is not to imply that MIS is in a position to engineer the acquisition of additional duties. These activities tend to be small but have a tradition of being independent. Control of these functions is largely uncontested, and therefore they may be very comfortable in some organizational niche. Since they are small, the rest of the company perceives them as relatively unimportant. 
It is during these periods of customer dissatisfaction that realignment becomes popular. It is also when the firm considers the merits of MIS assuming new duties, and MIS must be ready to handle them when drafted. In addition, MIS must recognize and then fend off inappropriate assignments. This process of introspective analysis is a therapeutic exercise and may uncover affinities that MIS may have with selected areas. Usually, MIS will see no reason to take over other functions. MIS should get away from viewing isolated pockets of service activities as potential trophies in the MIS fiefdom and see them as entangling activities that we have no business trying to manage. With this jaundiced attitude, the few occasions calling for organizational cohabitation will stand out on their own merits. The following guiding principles may show whether it is wise to adopt any new duties through realignment: Priority. Any new duty must pass the importance test. Is the task critical enough to commandeer precious MIS time and resources? Is it more important than those things that MIS will now not be able to get to? Is this how top management would invest MIS resources? Integration. One obvious reason for MIS's assumption of an organizational responsibility is the merging of technologies. For example, when integration of voice and data communications makes functional or financial sense, telephone operations are likely to join MIS. This union allows one group to manage changes in technology and customer service. Boundary disputes are eliminated by eliminating the boundaries. Transition. As one technology supplants another, the transition can be smooth if it is all contained under one roof. Facsimile devices have gone from being mail-room equipment to group or personal devices and eventually will become just another facet of personal computing. 
If one group such as MIS were to oversee this maturation of a technical application, change could be supported selectively or even encouraged. MIS can be a better agent for change if it has the full spectrum of responsibilities over an application area such as copiers. Technical skills. Even when no change in technology is contemplated, MIS can bring skills to an organizational marriage when other groups are using technology that they have no tradition in handling. Word processing equipment is an example of technology that other groups tried to deploy. MIS not only can implement and maintain the hardware and software but can be more technically circumspect in selecting products. While MIS could perform a service and advisory role without organizational integration, keeping such activities all in the family is more efficient. Management. You know that we have come a long way when you cite our management skills as a reason to combine another group with MIS. Yet, we have also built considerable talents that would enhance any service operation. Over the years, MIS has developed a compulsion for reliable, quality service. Service-level negotiation with users is a discipline that MIS pioneered. Cost/benefit analysis, vendor management and cost containment are all activities with which MIS has extensive experience. While our ability to communicate with top executives could stand improvement, it would still be a positive addition to many disenfranchised units in our companies. Even functions seemingly unrelated to MIS _ such as copying facilities management, security and courier services _ could benefit from the pure service management talents of MIS. These strengths might make it sound as if MIS is the right organizational companion for everyone. Nothing is further from the truth. Organizational realignment with MIS must have compelling reasons before redirecting precious MIS time and attention to it. 
Beware of and shun the stupid reasons for combining units, such as ``no one else wants it'' or ``this will be organizationally convenient.'' MIS cannot be a home for outcasts and misfits and still perform its primary mission. If we can offer improvements and a move makes sense, then we should aggressively assume new responsibilities. But MIS should heed the admonition of signs that once hung in army mess halls: ``Take all you want, but eat all you take.'' By James Young; Young is managing director of MIS at the Wheeler Group, a division of Pitney Bowes, Inc. in Hartford, Conn. <<<>>> Title : Export's about-face Author : CW Staff Source : CW Comm FileName: profile Date : Jan 30, 1989 Text: Three years ago, the U.S. Department of Commerce was doing such a poor job of processing export licenses _ most took 46 days, and some applications were lost not once but three times in a row _ that critics said the job should be handed over to the Pentagon. ``But I'm a competitive son of a bitch, and I didn't want to give away something like that,'' says Lee Mercer, who was hired by the department as deputy undersecretary for export administration to revamp the operation and oversee automation of its reams of paperwork. What Mercer and John Young, director of information resources management (IRM) at the Bureau of Export Administration, did was to turn the department's black eye into a Gold Medal Award for several successful automation projects. Today, the department processes export licenses within five to 14 days, and none fall through the cracks. The bureau's job is to review business applications for export licenses, determine if they meet export-control regulations preventing the diversion of high-technology products to the Soviet bloc and issue the approved licenses as quickly as possible. It issues about 100,000 licenses a year, representing $100 billion in sales by U.S. companies, or roughly 35% of U.S. merchandise exports. 
Presenting the Gold Medal Award to the IRM staff in October, Commerce department executives praised the electronic-licensing system as the envy of U.S. trading partners _ Japan, Germany and the UK are copying the bureau's system _ and for helping to speed U.S. exports to foreign markets. Brickbat bashing That praise is a far cry from the brickbats hurled at the department a few years ago. ``Some people told me I shouldn't take the job,'' Mercer says, ``because the unit had such a poor reputation. And some people told me that John Young's group did not have the capacity to automate the system.'' Actually, in 1984, Young had drafted a visionary plan for computerizing the paper-intensive process, but he lacked the political clout to get it moving. A few years later, that clout was provided by Mercer, who was described by one bureau source as ``a tough guy in a situation that needed a tough guy.'' ``What I added to this equation is impetus and support from the top,'' Mercer says. ``Without the support of top management people, who have a view of where they want the organization to go, the IRM will run into the natural roadblocks of everybody defending their turf.'' For example, in order to forge agreements between the users and the IRM office, Mercer decided to chair the steering group and require users to send a representative to every meeting because key decisions would be made at these gatherings. No one claimed that automating the process of export licensing would be easy. Young says there were five different studies conducted from the early 1970s to 1983, and all concluded that it was impossible to automate 100% of the process. The key to success, Young and Mercer agreed, was to try to make incremental progress anyway. ``Automating the entire process probably is impossible, but you can get 50% of it very quickly, then 70%, then 90%, with not a lot of expense or risk,'' Young says. 
The fundamental problem was that the bureau handled 20 million pieces of paper a year, Young says. Despite the inefficiencies, it was hard to wean the paper-handlers away from the paper. What did the trick, the IRM director says, was to provide a small group of users with workstations and give them lots of special attention. Once the others were able to see the value of the new gadgets, they wanted some, too. Rather than an organizationwide mandate, ``you need to get change agents out into that users group who will preach your philosophy,'' Young concludes. Driven designer Young also gives credit to IRM staff member Jack Floyd, who designed the computer system and was described as a man ``driven to fix the export licensing process.'' The result is the Export Control Automated Support System, an on-line system for licensing officers that runs on an IBM 3081. The bureau has also developed the following ancillary systems: The Electronic License Application and Information Network, which allows exporters to submit their applications and receive approved licenses via value-added networks. The Licensing Officer Access System, providing access to 15 million records on past and present applications to help licensing officers make faster and more consistent licensing decisions. A 32-station local-area network for office automation applications. The System for Tracking Export License Applications, or STELA, a computerized voice-response system that allows exporters to check on the status of their license using a Touch-Tone telephone. Exporters tap in the number of the application and hear a synthesized voice that states exactly where an application is in the process and how long it has been there. STELA was a simple yet elegant response to a big political problem. Previously, exporters had so much trouble determining the status of their applications that they resorted to calling the man at the top, then-secretary Malcolm Baldrige, who in turn ordered his staff to find the answers. 
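The lookup STELA performs _ key in an application number, hear the application's stage and how long it has been there _ can be modeled in a few lines. The application numbers, stages and function name below are invented for illustration; the real system ran as a voice-response front end to the bureau's application database.

```python
# Hypothetical status records keyed by application number.
STATUS_DB = {
    "123456": ("referred to a licensing officer", 4),  # (stage, days there)
    "654321": ("approved; license being issued", 1),
}

def stela_response(application_number):
    """Return the text a synthesized voice would speak (illustrative)."""
    record = STATUS_DB.get(application_number)
    if record is None:
        return "application number not found"
    stage, days = record
    return "your application is %s; it has been there %d days" % (stage, days)
```

A simple keyed lookup like this was enough to take the status-inquiry burden off the secretary's office.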
Desperately seeking export Baldrige reportedly got 20 calls a week from desperate exporters, Young says, recalling that former IBM Chairman John Opel called Baldrige every three weeks, and one executive chased Baldrige down during a European vacation. STELA was one of the ideas in Young's 1984 automation plan that had stalled. ``Lee Mercer got a lot of advice on STELA: `Don't turn it on,' `It's too early' and `It's going to fail.' He looked at it, said it will work and gave the signal to go'' in May 1986, Young says. STELA was operating in October of that year and has been very popular inside and outside of the agency, he adds. Young says the STELA episode illustrates the importance of decisive action, fast timetables and top-management support. ``The best thing about it all is that the same people who were told three years ago that they didn't have the capacity to do the job _ well, they did the job,'' Mercer concludes. ``I gave them a chance to implement their ideas, I helped with some guidance, and, where necessary, I threw some cross-body blocks to protect them,'' he says. By Mitch Betts, CW staff <<<>>> Title : ATLANTA _ Taking the fiel Author : CW Staff Source : CW Comm FileName: fran Date : Jan 30, 1989 Text: ATLANTA _ Taking the field for the Super Bowl may seem far removed from the realm of managing information systems, but one person familiar with both worlds contends that some principles contribute to success or failure in either. That is the view of Fran Tarkenton, who three times stood in the spotlight shared by yesterday's Super Bowl quarterbacks on his way to National Football League career records for completed passes, passing yards and touchdown passes. Along the way, he also started a management consulting firm, Tarkenton & Conn, Inc., now 19 years old and employing 25 professionals. Tarkenton credits notions about managing people for much of his gridiron stardom and says the same ideas make businesses work better. 
Tarkenton, 48, who is also chief executive officer of Knowledgeware, Inc., a vendor of software development tools, says MIS organizations in particular need to improve their management of people. ``In data processing, we are behind in that regard,'' he said in addressing the Society for Information Management's (SIM) annual conference. Tarkenton devotes most of his attention to Knowledgeware, but his counsel to IS managers dwells on managing people. In an interview, two books and speeches to groups including the SIM and Data Processing Management Association, Tarkenton discounts methodologies such as quality circles and management by objectives. They are valuable, he says, but are not as critical as managing people effectively, which requires teamwork, accountability and feedback. He acknowledges these bromides are not new, but he emphasizes the need to implement the ideas. A one-time television host, Tarkenton draws laughs rather than yawns in imparting the message to lunching conference-goers by peppering it with relevant anecdotes from his football career. He drives home points in the $15,000 talks with a rhythmic delivery and hushed tones suggesting his upbringing as the son of a Pentecostal preacher in Athens, Ga. ``Technology,'' he told the SIM members, ``only helps us in data processing or [manufacturing] plants when people buy into it and use that technology. We have to have the ability in data processing to manage change.'' People resist change from uncertainty over how it will affect their status and security, Tarkenton says. But leaders can get workers to buy into change by bringing them into the process _ listening, understanding how the change will affect them and making them part of a team. Building teams also helps bring out workers' ideas, and Tarkenton attributes his reputation as a crafty play-caller to taking advice from teammates. Ray Abi, a consultant at Unitech International Corp. 
in New Canaan, Conn., who arranged a talk by Tarkenton, relates such notions about teamwork to developing systems. Managers have to call a ``play'' and get developers to follow it, Abi says. ``In developing a major system, people often go in different directions. They have to be part of the play. If they go in different directions, the project won't work.'' Tarkenton says employees also need accountability, ``some kind of a scorekeeping system,'' as athletes have a score and statistics. Along with providing feedback, scorekeeping can make work more fun, he says. Tarkenton says reinforcement should be systematic, timely and specific _ as in a manager complimenting a programmer on particular lines of code _ and it should mix constructive criticism with a good deal of positive reinforcement. He promotes incentive-based compensation to get employees emotionally involved in work. Taking risks is another key to success in business and football, Tarkenton says, noting that he also holds the NFL career record for throwing intercepted passes. What about failure? Tarkenton's biggest one, in the view of some observers, is his failure to lead the Minnesota Vikings to a Super Bowl win in his three tries. He says that while the losses are a huge disappointment, he does not let them overwhelm him. By David Ludlum, CW staff <<<>>> Title : Consultants walk a rough Author : CW Staff Source : CW Comm FileName: career23 Date : Jan 30, 1989 Text: For MIS professionals confronted with the constraints and uncertainties of corporate consolidations, independent consulting can present an inviting alternative. In fact, the corporate restructuring is creating a growing demand for consultants, says Craig Bickel, a managing associate at consulting firm Index Group, Inc. in Cambridge, Mass. As firms slim down by reducing in-house staff, they bring in more consultants, Bickel says. 
``People are being driven into the [consulting] market just when the same forces are creating consulting opportunities.'' However, the road to success as an independent consultant is strewn with pitfalls. High turnover There is no exact figure on the survival rate for independent computer consultants, but the Independent Computer Consultants Association (ICCA) in St. Louis reports a 25% annual turnover in membership, even though the group continues to grow. Most of the turnover is attributed to members leaving the computer consulting business, according to Jack Christensen, executive director of the ICCA. ``Most people start consulting with their first job already lined up, but then they complete the job and it ends,'' explains Gene Sutton, a consultant since 1976 and president of the Greater Boston chapter of the ICCA. Once the initial assignment is over, the new consultant has to market himself, hustling to get the next job. ``At that point, they fall out pretty quickly,'' Sutton says. He estimates that as few as 25% of new consultants make it past start-up and sustain an ongoing operation. Christensen warns that success in independent consulting takes time. ``If someone is looking at consulting as an interim thing, he shouldn't be doing it,'' he says. Successful consulting demands a psychological commitment, he adds. Otherwise, suddenly being on your own without a steady paycheck can be overwhelming. Last year was a hard one for Seth Metzger from Duxbury, Mass., who turned to independent consulting after working in corporate MIS and then at a computer vendor. After contemplating the move for several years, Metzger was finally pushed into it when the vendor went bankrupt, presenting him with the opportunity to take on some of its customers. But after completing his first assignments, Metzger, who specializes in systems integration and writing mainframe applications, found that further assignments were not immediately forthcoming and contemplated dropping consulting. 
But the picture improved as some proposals eventually brought in business, and he decided to stay with it. ``I discovered that it takes longer than you think,'' says Metzger, who still relies on brokers to fill gaps between work with his own clients. Alex Krazesky from Cambridge, Mass., has been a consultant for the past five years but still has not reached a level of stability. ``I don't really feel I've arrived. It is always a hassle finding work,'' he says. After 15 years of bouncing from job to job in the MIS world, often as the result of an outspoken nature that did not suit a prevailing corporate culture, he decided that he might as well be independent. Krazesky, who takes on mainframe database or telecommunications assignments, also contracts directly with clients but turns to brokers when business is slack. ``If I have nothing lined up, I may call five or six agencies and tell them I have time available,'' he says. Dave Cassell, a computer consultant broker in Houston and president of the National Association of Computer Consultant Businesses, says someone whose long-term direction is technical rather than managerial may do better financially as a consultant than as a corporate employee, while also enjoying varied technical challenges. In general, however, consulting is not a way to get rich quick. Some consultants earn handsome salaries, but most of the ones who hang on make the equivalent of or slightly more than their pay in the corporate world, consultants say. Hourly rates range from $40 to $100 or higher, but consultants must spend time at nonbillable tasks such as marketing and administration. Consultants also must absorb the cost of overhead and benefits that employers usually pick up, such as health insurance and paid vacations. Brokers take commissions of as much as 30% of the consultant's fee, although the details of the arrangement are negotiable. 
A 1986 addition to the federal income tax code, Section 1706, limits the extent to which independent contractors can rely on a broker for regular assignments. Stable and able One consultant who has reached the point of stability is Morris Segal, a partner in Systems Consulting Professionals (SCP) in Alexandria, Va., which assists systems integrators with chores from drafting proposals to coding. With three other partners, Segal formed SCP four years ago after the group spent two years developing the business part-time while working for a computer vendor. Unlike most consulting businesses, SCP was successful from the start and has never had trouble sustaining itself. ``We haven't had slack periods. We never advertised, and we don't do much selling,'' Segal says. He attributes the success to the part-time preparation and the experience of the four computer industry veterans with both vendors and MIS organizations. The group's contacts brought business through word of mouth, Segal says. By Alan Radding, Special to CW; Radding is a Boston-based author specializing in business and technology. <<<>>> Title : PC expenses can mount qui Author : CW Staff Source : CW Comm FileName: market23 Date : Jan 30, 1989 Text: ``Complete 20-megabyte hard disk system,'' reads the advertisement. ``IBM compatible, $1,095 includes monitor and free word processor.'' It sounds pretty good. Add another $200 or $300 for a dot matrix printer, spare floppy disks and printer paper, and you are all set. Not necessarily. The fact is, when you install personal computers, you are liable to be very surprised at some of the follow-on expenses. Take printers, for example. ``One thing people overlook is the cost of a good printer,'' says Michael Ferrier, vice-president at Airline Tariff Publishers in Washington, D.C. ``When they start looking, they plan on a cheap dot matrix printer but then get surprised when they decide they have to have a letter-quality, 24-pin, wide-carriage printer.'' Or more. 
One purchaser of an IBM Personal Computer AT-compatible hoped to use it in his tax preparation business but balked when he learned that his software required a laser printer. That piece of equipment can easily cost more than the computer itself, but because it can print both Internal Revenue Service forms and data, it will save an enormous amount of time by avoiding the need to feed in the forms. Jim Stone at White Swan, Inc., a wholesale food distributor in Fort Worth, Texas, provides some general advice. ``In the very beginning, define a strategy for the use of PCs within the company,'' Stone says. ``Develop corporate PC strategies to prevent, as best you can, each individual going off on his own and securing PCs and software.'' Repairs cost money Maintenance is a common source of expense for PC users. Edward Wyatt, director of the Computer Center at Equitable Life Assurance Society of the United States, also in Fort Worth, takes a cut-and-dried attitude toward the issue. ``Our computers have been so reliable that we simply replace or fix what's broken, and we don't keep a standard maintenance agreement on our PCs,'' he says. But this approach can lead to trouble. ``When we were using a time-and-materials maintenance method, one of our PCs once had to have a board replaced, and that cost $750,'' says Robert Knepp, general manager of Blue Cross Shared Services Center in Lemoyne, Pa. ``Now they're on regular maintenance, and that sort of thing is covered.'' For people who decide to get maintenance contracts, there are several types:
On-site maintenance. A service person comes to your office and performs the repairs there or leaves a loaner if the repairs will require more time. The repair service may guarantee turnaround in four, eight, 24 or 48 hours, or it may offer different options at different prices.
Depot maintenance. You take the computer to the service shop and pick it up when it is repaired.
Pickup and delivery. 
This method falls between the other two in terms of convenience. Maintenance is done at a depot, but someone else takes the computer back and forth. Depot maintenance can get complicated. If a system has a printer problem, for example, you need to take both the printer and the computer to the repair shop unless you are sure where the problem is. On the other hand, on-site maintenance can run from $400 to $1,000 per year per system, depending on system size and complexity. The need to upgrade might provide another unanticipated expense. ``Because software packages change so much, if you don't have the newest PC in the world, you might have a problem running the latest software. And it's costly to upgrade a machine,'' notes Lois Brooks, an accountant at Creative Property Management Co. in Cedar Rapids, Iowa. But wait, there's more Training can provide another expense. However, it seems to be a cost that some people overestimate; TV commercials for Macintoshes indicate that training on IBM PCs and compatibles costs a fortune, but some users say otherwise. ``We've been sending people to classes at the dealer,'' Airline Tariff's Ferrier says. ``These courses are very inexpensive, something like $99.'' Richard Brooks, corporate production executive at Banks of Iowa Computational Services, Inc., keeps training costs down a different way. ``We use the PCs themselves to train people,'' he says. ``We have a bunch of computer-based training software packages we run on the PCs.'' Indeed, Brooks emphasizes the role of the PC as a money saver. ``PCs are kind of like workhorses for us,'' he says. ``They actually in some cases take the role of a human being because they do rudimentary tasks that a human would have to do. So it winds up being cheaper to put a PC in here than it is to hire a person.'' By John J. 
Xenakis, Special to CW; Xenakis is a computer columnist for The Boston Globe, software editor of the Boston Computer Society's Computer Update magazine and host of a weekly radio show. <<<>>> Title : Finessing the training co Author : CW Staff Source : CW Comm FileName: train23 Date : Jan 30, 1989 Text: Should you have a contract with your training vendors? You bet you should. Are training managers good at negotiations and contracting? You bet they aren't. The corporate contracts department might be able to help negotiate training contracts, but you better lay out the requirements very clearly or the department may go strictly by the vendor's list prices and foul things up. The legal department also might help, but it may take forever to get the contract out. For these reasons, you are liable to find yourself on your own when negotiating contracts with vendors. Therefore, it is important to be aware of steps you can take to protect yourself and reduce your costs. The interesting thing about contracts with training vendors is that the only time they come out of the drawer is when something goes very wrong. Unfortunately, in accordance with Murphy's Law, things that go very wrong often are not covered in the contract. But a contract can get some of the niggling issues out on the table and cleaned up before they become major problems. If you are just going to run a single three- or five-day class or a course on very short notice, you probably do not need an elaborate contract. Most vendors use standard agreements that cover the basics, including such things as the cost of the course, the number of students, which party will provide handouts, which one will pay for travel and living expenses, the place and time of sessions and cancellation fees. Get it in writing If you offer a lot of single courses through vendors, you may want to ask your legal department to draw up a standard agreement for all of your vendors. It can take the form of a letter. 
The agreement should ensure that the vendor owns the course and is not delivering another vendor's material and that your company is not liable for any injury to the instructor during travel or delivery of the course. The agreement should also call for reasonable travel and living expenses, perhaps with specific standards for air travel, car rentals and taxis, hotel rooms, meals, entertainment, telephone calls and such things as laundry and dry cleaning. An alternative approach is to settle on a reasonable per diem rate and let the vendor fend for himself. The agreement should specify a schedule for the vendor's arrival in the building, the time classes begin and end and the length of breaks. You should require the vendor to get your written approval to use your name in any advertising or to interview and hire any of your employees. You also may want to address necessary security issues such as requiring the instructor to be a U.S. citizen. For the best price breaks and for more long-term relationships, you should consider entering into annual contracts with your vendors. Each one should be unique to a vendor and should probably include all of the items noted in the standard letter agreement. You should also do the following:
Negotiate a specified number of teaching days to be delivered during the period of the contract _ not specific courses. This strategy provides the flexibility to modify what you offer to your internal clients should their needs change.
Plan a year's worth of classes over the course of your annual budget but set the contract for an 18-month term. This step provides a hedge: It allows you to push courses into the next budget year should something go wrong with your current budget.
Do not be specific about dates. Allow the vendor to set the schedule for each quarter with your approval. In return, you should get a price break for providing the vendor with added flexibility. 
Be specific about the number of students in a class and how you handle overflows. For example, you pay a flat fee for every student over the maximum or you dip into future teaching days. Be specific about when you can cancel a class and what you pay if you cancel beyond the limit. There are some other things to consider to help you get a better rate from the vendor. You do not get a discount unless you ask. You are contracting for a fixed number of days that the vendor can count on, so you should get a discount. Also determine what happens to your rate if you conduct more classes than called for in the contract. There are some other steps that might help reduce the rate, such as printing handouts yourself (perhaps for the vendor to use at other companies as well), providing graphic support, giving the vendor access to hardware to develop new courses and piloting new courses for the vendor. Rent-a-chair You also could let the vendor sell seats in your classes to other companies. You could let the vendor use your classes to groom new or junior instructors or let the instructors attend classes you run on your own. You can offer to provide references or allow the vendor's prospective customers to sit in on your classes. The timing of payments also can make a difference. They might be required within 30 days of the course or up-front at the beginning of the year. The bottom line is that contracts ensure long-term relationships with your vendors. Both parties would like to see a lasting and beneficial partnership, and there are innumerable opportunities to make that partnership successful. The objective of the contract is to provide quality service at a fair price on a regular basis, with everyone clearly aware of the rules. By Bill Sebrell, Special to CW; Sebrell is a vice-president at Data Base Management, Inc., a subsidiary of American Management Systems, Inc. in Manchester, Conn. 
<<<>>> Title : Let's jail the virus make Author : John Barnes Source : CW Comm FileName: 20barnes Date : Jan 30, 1989 Text: In the last few months, the idea of computer viruses has exploded into the general media and thus into the mind of the man on the street. Much to my surprise _ maybe it's because word processors are now universal in journalism and so journalists take a keen interest in the topic _ reporting has been pretty good. There is an exception, though. The media tends to portray the virus maker as a genius _ usually a twisted, eccentric or egotistical genius _ but basically a genius. If we can re-educate the public on this point, we'll have done a lot to stop viruses. The computer virus maker is supposed to be clever, a brilliant fellow gone wrong, Tom Swift driven by the cruel, misunderstanding world to become Victor Frankenstein. In fact, it's well-known within the industry that most such people are bozos with an excessive need for attention they cannot attract through talent alone. This hurts us in MIS. To begin with, it tends to make high-level policy people who are not computer-literate treat viruses as a technical problem rather than the social one they are. If you think you're facing malevolent geniuses, you're going to divert a lot of intellectual resources to securing yourself against them. Further, because genius is scarce and valuable, you'll aim for leniency toward the ones who get caught, hoping to turn their ``talents'' to good purpose. Steps to take The truth is that there are adequate technical remedies already in existence for viruses. ``Safe computing'' is already here, and it's cheap. It has to do with physical security _ not downloading anything from the outside into a system without knowing where it's from. When there's doubt, it must be subjected to a program that can find viruses. What hasn't been here is the willingness to jail the offender. The punishment must be in proportion to the crime. 
Damage in the millions of dollars ought to warrant time behind bars. There are a lot of people in the world who are fairly smart but not smart enough to support their own egos. They have grandiose visions, but while they talk well, they deliver little. After a while, these dreamers haven't gotten nearly as far as they think they should have. So they begin to feel it's a nasty, ugly world, one that knocks things apart for no reason, and naturally enough, they develop an ugly desire for random destruction. If they've been told that their unpleasant activities are works of genius, that lets them reclaim some of their lost ego inflation: ``Poor, misunderstood genius me, I'll show them how clever I am.'' What we need to get across to the public is that viruses are the products of second-rate minds with first-magnitude grudges. If you find yourself interviewed on the subject, especially after an incident at your facility, let me suggest an example that I've found useful in explaining things to computer-illiterate and computer-phobic friends: Suppose you have a robot in your kitchen that does all the cooking. The robot reads instructions you write on file cards, one instruction to a card, such as ``Preheat oven to 375 degrees'' or ``Add garlic and onion and simmer for another half hour.'' A benign virus is the equivalent of a card that says, ``Make a copy of this card and put it in the deck of another recipe.'' If it's less benign, it might be three cards, saying, ``Make a copy of this and the next two cards, changing any name of a day to the next day, and put them in the deck of another recipe. If this is Tuesday, empty the catbox into the food. If you have emptied the catbox into the food, pull out this and the previous two cards and burn them.'' If you want to devise other examples of recipe viruses, you'll find it's easy _ because you hardly have to know a thing about programming. The person who wrote the virus does not need to know how to cook. 
He doesn't need to know anything much about the kitchen or even about the robot. He is, in short, not much more than an annoying vandal. By controlling who gets into your kitchen, you can control him. Who are the really clever people? The ones who can write programs that catch viruses. I've seen several such programs in action and looked at their operation in detail, and I'm impressed. Imagine _ to return to the example _ a set of file cards that would tell the robot how to search a recipe for viruses, find them and discard them. You'll quickly see how much more intellectually demanding it is. If we can get the idea out to the public that a virus maker is not a genius, not a twisted lonely soul and much more a nuisance than a menace, we'll have robbed him of his chance to see himself as a hero and his power to frighten upper management into expensive, unnecessary crash programs. By John Barnes; Barnes is the Pacific Northwest area manager at ADG, a high-tech marketing organization based in San Pedro, Calif. <<<>>> Title : Expert systems: Quiet her Author : CW Staff Source : CW Comm FileName: aia Date : Jan 30, 1989 Text: The term ``expert systems'' is increasingly misleading. Rather than replace an expert, more and more of these systems are developed to help all kinds of employees by handling routine chores. At MCI Communications Corp., expert systems are used to ``make everyday systems easier to use,'' according to Dan DeSalvo, a manager in the Advanced Technologies Group. One expert application was designed to assist the many marketing and sales people within MCI who need an array of constantly changing customer and product information. The Commercial Prospects Advisor was developed using Artificial Intelligence Corp.'s Knowledge Base Management System to simplify the task of querying a database. The result is that users at a terminal do not need to know whether the information lies in a database built with Adabas or DB2. 
The result has been to disseminate information more widely, according to DeSalvo, because an obstacle to users has been removed. Ultimately, the system will make more than one million records easily accessible to more than 1,000 users. Furthermore, the Commercial Prospects Advisor took only five months to build and will be easier to maintain than a conventional application, DeSalvo stated. Emphasizing the application's ease of use, DeSalvo points out that the only user manual is a 3-by-5 card detailing the logon procedure. Down to earth Tom Schwartz, president of AI market research firm The Schwartz Associates in Mountain View, Calif., concurs that artificial intelligence technology brings a down-to-earth benefit, boosting productivity and lowering costs. Because AI allows programmers to deal with problems at a high level of abstraction, it creates systems that are easier to develop and maintain than programs written in a procedural language such as Cobol. The rules-based design of expert systems will allow applications to be changed much faster as requirements change, he said. In keeping with the philosophy of using AI to simplify everyday tasks, MCI has used expert systems to automate jobs it understands very well, thereby freeing up an expert's time. One such application involves managing microwave transmissions, a major task at MCI. Microwave signals frequently have to be rerouted and switched to a spare transmission path. To manage this, data on microwave signals is monitored and fed to a surveillance center. The work of poring through this incoming data and deciding which signals need to be switched to another path was a full-time job. Today, an expert application developed with Intellicorp, Inc.'s Knowledge Engineering System, called the Switch to Spare system, classifies the data and identifies signals that must be rerouted. 
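The rules-based design Schwartz describes can be illustrated with a toy sketch. This is not MCI's Switch to Spare system, which was built with Intellicorp's tools; the signal attributes, thresholds and actions below are all invented to show why encoding behavior as a table of rules makes an application easy to change.

```python
# Toy rules-based classifier in the spirit described in the article.
# The signal attributes, thresholds and actions are hypothetical.

RULES = [
    # (condition on one signal reading, action) - order matters:
    # the first matching rule wins.
    (lambda s: s["error_rate"] > 0.01,   "switch to spare path"),
    (lambda s: s["strength_dbm"] < -80,  "switch to spare path"),
    (lambda s: s["error_rate"] > 0.001,  "flag for review"),
]

def classify(signal):
    """Return the action of the first rule the reading satisfies."""
    for condition, action in RULES:
        if condition(signal):
            return action
    return "no action"

# Changing the system's behavior means editing the RULES table,
# not rewriting procedural code:
print(classify({"error_rate": 0.02,  "strength_dbm": -60}))  # switch to spare path
print(classify({"error_rate": 0.005, "strength_dbm": -60}))  # flag for review
```

Because each rule is data rather than code, new requirements translate into new table entries, which is the maintainability advantage Schwartz attributes to expert systems over procedural Cobol programs.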
At American Mutual Family Insurance Group in Madison, Wis., the primary focus of expert systems is, as at MCI, everyday rather than esoteric applications, according to Herb Thompson, development support supervisor. One expert systems application, the Equity Data Calculator, calculates the return amount due a customer after insurance has been canceled because of a bounced check. Before this system came on-line a month ago, employees had to get information from three different databases and use a calculator to figure out the prorated value of a premium and, ultimately, the amount of the return. In addition to ensuring that money is not returned inappropriately, Thompson estimates that at least 10 minutes of processing time is saved for every situation in which insurance is canceled because of a bounced check _ about 150 times a day. Users can access the application, which runs in an IBM MVS/XA environment, from standard menu options displayed on their terminals. Using expert systems techniques has drastically cut development time and simplified maintenance, Thompson said. By Amy Cortese, CW staff <<<>>> Title : DEC offers automatic watc Author : CW Staff Source : CW Comm FileName: decmon Date : Jan 30, 1989 Text: MAYNARD, Mass. _ Most systems managers would probably agree that if finances allowed, they would post a 24-hour guard at every computer site. Digital Equipment Corp. may have come up with the next best thing. The recently introduced Environmental Monitoring System (EMS) is a microprocessor-based electronic watchdog that uses up to 112 external sensors to provide warnings against data-threatening abnormalities such as fire, flooding, extreme temperatures, excess humidity or security breaches. The system can be used in place of security personnel to cut operations expenses, according to Sara Williams, DEC's environmental products manager. 
The EMS surveillance system responds with a visual display and an audio announcement on its connecting terminal whenever real-time samplings of these changes exceed user-established thresholds, DEC officials said. EMS sensors can be located up to 5,000 feet away from the command-post microprocessor, DEC said. Users can also customize their levels of response. A first alert to a computer room exceeding its temperature, for example, may be for a warning bell to ring at a central command post. If the temperature rises a few degrees, the whole system could automatically shut down. The EMS can work as a stand-alone unit, be daisy-chained to seven additional computer installations at a site to form a monitoring network or be hooked via a modem to a system manager's home, DEC's Williams said. Environment watch An optional software package _ VAX Remote Environmental Monitoring Software (REMS) _ gives an operator the ability to monitor data garnered from an entire network of environmental-monitoring systems. REMS provides a continually updated database on such things as power conditions, temperature, water and the status of security systems, DEC officials said. When REMS detects an abnormality, it automatically relays alarm signals, sends electronic mail and activates preprogrammed defenses. If one power supply fails, for example, the software can automatically switch computers to another source while simultaneously notifying systems personnel, said EMS product manager John Yurcak. A basic EMS starter package including four on/off-type switch sensors that can be connected to a fire alarm or sprinkler system, one water detector, two temperature sensors and an output relay sells for $5,995, DEC said. A REMS package costs $3,650. 
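The tiered responses DEC describes, a warning bell at one threshold and an automatic shutdown a few degrees higher, amount to an escalation table keyed to user-established thresholds. The sketch below is a hypothetical illustration of that idea; the temperatures and actions are invented, not DEC's defaults.

```python
# Hypothetical escalation table for a single temperature sensor,
# illustrating the tiered-response idea described in the article.
# Thresholds (deg F) and actions are invented, not DEC defaults.

ESCALATION = [  # (threshold, response), checked from most severe down
    (95, "shut down systems"),
    (90, "page the systems manager"),
    (85, "ring warning bell at command post"),
]

def respond(temperature_f):
    """Return the response for the highest threshold exceeded."""
    for threshold, action in ESCALATION:
        if temperature_f > threshold:
            return action
    return "normal"

print(respond(86))  # ring warning bell at command post
print(respond(97))  # shut down systems
```

Each sensor in such a scheme can carry its own table, which is how a user customizes levels of response without changing the monitoring logic itself.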
By James Daly, CW staff <<<>>> Title : Japan chip plans include Author : IDG News Service Source : CW Comm FileName: japanchi Date : Jan 30, 1989 Text: TOKYO _ Two of Japan's giant electronics firms announced this month commercial plans for new microprocessor developments. NEC Corp. announced the development of a general-purpose 32-bit microprocessor that is capable of up to 15 million instructions per second (MIPS) and features an internal clock speed of 45 MHz. Matsushita Electric Industrial Co. said it has developed a 64-bit reduced instruction set computing (RISC) ``superchip'' for use in parallel processing. NEC researchers employed a ``silicide-gate technique,'' which decreases electrical resistance in the circuit, to integrate 385,000 micro devices on an 8.34mm by 8.28mm chip, according to a company spokesman. Sample shipments of the low-end version running at 33 MHz and processing 11 MIPS are expected to be available this fall. Matsushita said it will release specifications of its new chip in February at the International Solid-State Circuit Conference '89 in the U.S. While Digital Equipment Corp. and General Electric Co. are also expected to announce similar chips then, the Japanese consumer electronics firm is looking to get a jump on superchip competition by starting commercial production of 32-bit RISC chips now. <<<>>> Title : Korea sees chip sales dou Author : IDG News Service Source : CW Comm FileName: korea Date : Jan 30, 1989 Text: SEOUL, Korea _ The Koreans are finally coming. With the U.S. semiconductor industry still reeling from Japanese competition, the major South Korean semiconductor makers are setting aggressive export targets this year. Samsung Electronics Co., Goldstar Semiconductor Ltd., Hyundai Electronics and Daewoo Telecom Co. have set an ambitious semiconductor sales goal of approximately $2.65 billion for 1989, almost double last year's sales of $1.4 billion. 
A local industry analyst said that thanks to the expanding global semiconductor market, particularly in Southeast Asia, all four vendors have increased their production capacity of 1M-bit dynamic random-access memory (DRAM) chips since last year. With sales of almost $1 billion in 1988, Samsung has set a 1989 sales goal of $1.64 billion. The firm plans to sharply increase its 1M-bit DRAM manufacturing capacity to 5.5 million units per month from the current 1.5 million with the completion of its 1M-bit DRAM production line early this year. It also plans to increase efforts to make 1M-bit static random-access memory chips. Goldstar said it expects to ship $534 million worth of memory chip and application-specific integrated-circuit products this year, almost triple its total 1988 export sales. Chip sales for Hyundai Electronics are expected to reach twice its 1988 chip sales of $223.9 million. The company will rely on the production of 256K-bit DRAM chips and is expected to start mass-producing 1M-bit DRAM chips in the second half of the year. Daewoo Telecom shipped $5.3 million worth of semiconductor products last year and plans this year to sharply increase shipments to $29.1 million, with a target of $10.9 million from overseas sales. <<<>>> Title : Friday 13th worm gets DEC Author : CW Staff Source : CW Comm FileName: decworm Date : Jan 30, 1989 Text: While several companies in the U.S. and abroad were reportedly struggling to quash the newest strain of Friday the 13th virus that hit computers a couple of weeks ago, Digital Equipment Corp. engineers were chasing down a worm. Unlike a virus, which must attach itself to a program to execute, a worm can replicate itself, often to the point of overloading a computer's memory until it can no longer function. A company spokesperson said that a worm was put into DEC computer systems the evening of Friday, January 13. 
``The systems manager noted the worm almost as soon as it came in,'' said Nikki Richardson, a company spokeswoman. While the worm did not disable any of its systems, DEC engineers uncoupled some connections between systems while they concocted a vaccine to stamp out the worm. The vaccine, which was designed to follow the worm's trail, cleared the affected systems of the worm by Monday morning, Richardson said. DEC officials refused to speculate on the origin of the worm or to reveal how many systems were affected or where they were located. ``It's an internal investigation, and it will be finished when it is completed,'' Richardson said. By Michael Alexander, CW staff <<<>>> Title : Strife hits industry in f Author : CW Staff Source : CW Comm FileName: earn4 Date : Jan 30, 1989 Text: Intense competition and increasingly demanding and sophisticated users have combined to create an era of intolerance in the computer industry _ and companies from venerable NCR Corp. to entrepreneurial Apple Computer, Inc. are paying the price. Traditionally one of the industry's reliable slow and steady gainers, NCR attributed its fourth-quarter stumble to declining domestic orders. Moreover, the firm cautioned that falloff in second- and third-quarter year-to-year orders bodes badly for overall growth prospects in the first half of 1989. Domestic sluggishness did indeed impact NCR, said William Easterbrook, an analyst at Kidder, Peabody & Co. To that extent, he said, the bad news is extrapolative: Substellar U.S. performance was a recurring refrain in computer company earnings reports for the December quarter. However, Easterbrook added, ``NCR's orders were soft because they haven't had any blockbuster products in the past six months, and the new products they had, they got out late.'' Apple got an object lesson in the meaning of market intolerance when its fourth-quarter numbers, while impressive, showed a slip in profit margins. 
The shortage-induced high cost of dynamic random-access memory, said Steven Ossad, an analyst at County Natwest Securities USA, contributed to the slight shift in gross margins. More significant, Ossad said, were price increases instituted by Apple last autumn. ``That was the real culprit,'' he said. ``The prices were so high that buyers moved to less-rich configurations.'' Cutbacks hurt, not help Hoping to tempt customers back, Apple last week cut some of the earlier increases and decreased the prices of certain add-ons for the affected Macintosh computers. Far from being reassured, ``Everyone apparently focused on the price cuts and saw them as indicating a slowdown in demand for the Macintosh line,'' noted David Soetebier, an analyst at A. G. Edwards & Sons. Immediately following the announcement of the earnings report and news of the price rollbacks, Apple stock plummeted 3.38 points. ``They got shellacked,'' Ossad said, calling Apple a strong company ``still achieving staggering gains.'' Other analysts agreed. Diminishing sales of older products and production delays in the new VS 5000 line handed Wang Laboratories, Inc. _ and Wall Street _ a shocker: earnings down 97% in the December quarter. However, said Jeff Goverman, an analyst at Soundview Financial Group in Stamford, Conn., ``People are more focused on futures with Wang.'' Orders for the delayed VS 5000 are strong, according to the company, which also said that it already has booked 68 orders for a high-end minicomputer scheduled for announcement this week. Fault-intolerant users, on the other hand, delivered a strong quarter to the company that delivered hotly demanded fault-tolerant products to them. Tandem Computers, Inc. turned in an impressive fourth-quarter performance, flouting several industry cliches in the process. In a period marked by sagging U.S. 
sales, Tandem logged its second consecutive quarter of year-to-year growth in its domestic computer business, according to President James Treybig. While some companies have complained of corporate indigestion following large acquisitions, Tandem announced that recent acquiree Ungermann-Bass, Inc. exceeded its expectations. Magic? Hardly. Tandem's secret, Goverman said, is simple: give the people what they want. ``Tandem is in a high-growth area: OLTP for mission-critical systems,'' Goverman said. ``When Tandem talks to a customer, they go in talking OLTP and fault-tolerance, not `look at our whizz-bang technology.' They sell the area, not the machine.'' Timely supply to a demanding market also boosted Norwood, Mass.-based Phoenix Technologies Ltd. into a strong quarter. In a prepared statement, Chief Executive Officer Neil Colvin attributed the company's 100% profit surge and 67% rise in revenue to continued strength in sales of IBM Personal Computer compatibles and returns from Phoenix's line of system software products geared to the booming workstation and peripherals markets. Expected price erosion in the company's ST-506 and 380M-byte enhanced small device interface and small computer systems interface product lines set off an earnings dive-bomb at San Jose, Calif.-based disk drive vendor Maxtor Corp. This fall offset a more than respectable revenue rise and prompted the company to predict shrinking gross margins and depressed earnings potential in the near term. In a phrase worthy of first-ballot entry into the oxymoron hall of fame, CEO George Scalise declared that Maxtor is ``in a period of profitless prosperity.'' Oxymoronic but accurate, County Natwest's Ossad said: Despite the earnings debacle, Maxtor ``is still the best-positioned disk drive company in the industry. 
Their balance sheet is improved, they've shown very rapid growth in new products and they made money.'' ``The computer industry is in transition, and it isn't clear where the transition is leading,'' said Martin Ressinger, an analyst at Duff & Phelps, Inc. The one sure bet is there are no sure bets any more, he said. ``Not only does a company have to have new products and get them out on time, it also has to guess right with regard to what the customer's going to want by the time the company does get them out. That can be the hardest part.'' By Nell Margolis, CW staff <<<>>> Title : Ex-Unisys exec joins Nort Author : CW Staff Source : CW Comm FileName: stern Date : Jan 30, 1989 Text: MISSISSAUGA, Ontario _ When it comes to careers in multinational companies, former Unisys Corp. President Paul G. Stern literally wrote the book: Still untitled, it will be published by Warner Books, Inc. Now the 50-year-old veteran executive is about to add a new chapter. Effective March 1, he will become chief executive officer of communications giant Northern Telecom, Inc. Stern, who has been serving Northern Telecom as a board member and consultant since last spring, will take on the firm's chairmanship upon current Chairman Edmund Fitzgerald's planned retirement in April 1990. How will the new head of Northern Telecom proceed? ``Very carefully and very fast,'' he said in an interview last week. Borrowing a phrase from President George Bush, on whose campaign finance committee he recently served, Stern said that he intends to stay the course already set for the $4.9 billion firm, making sure that planned new products are well positioned in an increasingly global market. Stern's previous corporate stops, before joining the computer industry with former Burroughs Corp., included Du Pont Co., Rockwell International Corp. and Gillette Co. Stern has also been around Washington, D.C. _ another experience he expects to turn to his new company's benefit. 
With respect to both commercial products and defense contracting, he said, ``I know how to deal with bids, get export licenses, things like that.'' By Nell Margolis, CW staff <<<>>> Title : Ax falls at National Semi Author : CW Staff Source : CW Comm FileName: chips1 Date : Jan 30, 1989 Text: The chips are falling in price and profit margins _ and so are 2,000 employees at National Semiconductor Corp. National Semi announced last week that it will lay off 2,000 employees worldwide, or 5% of its labor force. This follows a layoff of 450 in August in its Datachecker Systems, Inc. and National Advanced Systems (NAS) subsidiaries. Other U.S. chip giants Intel Corp. and Advanced Micro Devices, Inc. (AMD) reported weaker financial results last week, showing evidence that the long-predicted down cycle in the semiconductor industry is at hand. Although sales may be growing, lower per-unit prices are pressuring National Semi and the industry in general to shrink operations, according to Millard Phelps, an analyst at Hambrecht & Quist, Inc. ``I expect [National Semi] to save between $32 million and $36 million in expenses by the end of its fiscal year,'' which is March 30, Phelps said. Cash infusion needed The layoffs and an infusion of cash from the recent sale of its retail systems company, Datachecker, for $90 million and the sale of half of its mainframe company, NAS, for $250 million to Memorex Telex N.V. should help National Semi's bottom line. In the last two quarters, the firm reported a net loss of $55.7 million. Despite the massive work force reduction, all 22 National Semi facilities worldwide will remain open, according to a company spokeswoman. AMD reported a 5% drop in fourth-quarter sales that contributed to a $34.1 million loss. The loss included a one-time charge of $17.3 million for the Sunnyvale, Calif., firm's restructuring, which included the previously announced layoff of 2,400 employees. 
AMD still managed to post a profit of $19.3 million for the year, a dramatic turnaround from a $64 million loss in 1987. After the acquisition of Monolithic Memories, Inc., AMD reported sales of $1.13 billion, up 13% from 1987. Like AMD, Intel saw a strong year slow down dramatically in the fourth quarter. In Intel's case, however, it was a matter of stockpiled supply exceeding demand for the 80386 microprocessor, which battered profit margins. Despite a 27% sales increase to $727 million, Intel's fourth-quarter earnings fell 10% to $86 million. For the year, profits, fueled by strong demand for the single-sourced 80386 early in the year, still grew 83% to $453 million on revenue of $2.9 billion, up 51% from 1987. Bucking the trend of the larger suppliers was Chips and Technologies, Inc., whose healthy growth continued in the second fiscal quarter ended Dec. 31. Earnings jumped 41% to $7.9 million on revenue growth of 68% to $54.4 million. Senior Editor Clinton Wilder contributed to this report. By J.A. Savage, CW staff <<<>>> Title : In brief Author : CW Staff Source : CW Comm FileName: 123week Date : Jan 30, 1989 Text: Atherton snatches Goldberg Computer-aided software engineering startup Atherton Technology in Sunnyvale, Calif., has snagged 24-year IBM veteran Arthur G. Goldberg as its president and chief executive officer. Goldberg, 45, was most recently director of business development at IBM's AIX Systems Project Office. Before his tenure in that department, he was director of workstation systems in the Entry Systems Division. Goldberg is considered to have been a key force behind the recent comeback of IBM's RT workstation. Unisys to sell Convex CPU Convex Computer Corp. has discovered a new distribution channel _ Unisys Corp. _ in Brazil. In a multiyear agreement, Unisys' Brazilian subsidiary agreed to sell and service Convex minisupers in Brazil. The agreement marks Convex's entry into the Latin American market. Perot gets active again H. 
Ross Perot, whose Perot Systems Corp. has been quiet of late, has himself been active on the investment front. Archive Corp., a Costa Mesa, Calif.-based manufacturer of 1/4-in. cartridge tape drives, confirmed last week that Perot, through a private investment group, has agreed to fund technology development at Archive in exchange for rights to purchase approximately one million shares of Archive stock. The actual amount of the Perot investment was not disclosed; Archive stock was trading at roughly $8 per share last week. Unix consolidation There will be consolidation in the growing niche of Unix systems software with the forthcoming acquisition of Taskforce Software Corp. by AIM Technology for an undisclosed amount. Santa Clara, Calif.-based AIM Technology specializes in Unix performance measurement and management software. It's official now Oracle Corp. and AT&T have formalized the sale of Oracle software on AT&T computer systems with an OEM agreement announced last week. AT&T sales representatives will sell Oracle products with AT&T's entire hardware line. Oracle currently holds similar agreements with other systems manufacturers.