Monday, August 31, 2009

Keywording Lemons - How To Avoid an Inflexible In-house Keywording System

Versatility is the key to a good in-house keywording system, so try not to get hooked into software that limits your options.

Keywording is one of the most important determinants of image sales, yet in software packages designed to run image web sites or provide back-end support for photo libraries, keywording is often a very basic add-on. Worse, it can be an extremely complicated affair, difficult and time-consuming to use.

If you ever want to change your keywording system to improve or change the type of keywords you have, or to try outsourcing, you may find yourself unable to do so without spending large amounts of money.

And if you ever decide to change to different web site or back end software, you may find your keywording cannot be transferred to the new system except at great cost.

Many such systems have been designed with little thought as to the cost of entering keywords, the standards of vocabularies and so on. After all, the people writing the software often aren't keyworders and may well have little experience of this unusual science.

Sometimes the keywording interfaces have been designed on the assumption that the people adding the keywords will be untrained, so limits are placed on how words are entered in order to achieve consistency. That has superficial appeal, but in our experience it may reduce poor keywording only marginally. More importantly, it may limit the ability of professional keyworders to do their job should they be employed in the future.

To help you evaluate the keywording aspects of image management software we've come up with this checklist. If you don't know the answers to these questions, you'll need to find out from your service provider:

1. Where are your keywords stored, and how easy are they to move?

If the keywords are held within the IPTC fields of each image, or in a database that can easily be exported to a spreadsheet, then you are in the best position to move elsewhere or change procedures. If you are locked into using only the database of the service provider, or if exporting the keywords is costly, then you have a major problem in the making.
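As a sketch of what "easy to move" means in practice, the fragment below (Python, with invented filenames and keywords) dumps a keyword catalog to a spreadsheet-friendly CSV. Any system whose keywords can be round-tripped through a format this simple leaves you free to change providers or procedures.

```python
import csv
import io

# Hypothetical example of a portable keyword export. The image names and
# keywords here are invented; a real system would read them from the
# provider's database or from each image's IPTC Keywords field.
catalog = {
    "IMG_0001.jpg": ["beach", "sunset", "queensland"],
    "IMG_0002.jpg": ["portrait", "studio", "smiling"],
}

def export_keywords_csv(catalog):
    """Write one row per image: filename, then keywords joined by ';'."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["filename", "keywords"])
    for filename in sorted(catalog):
        writer.writerow([filename, ";".join(catalog[filename])])
    return buf.getvalue()

print(export_keywords_csv(catalog))
```

The same CSV can be opened in a spreadsheet for revision or handed to an outsourced keyworder, which is exactly the flexibility a locked-down database denies you.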

2. Does the keywording interface include a lot of check-boxes and drop-down menus to select keywords?

Check-boxes and drop-down menus in themselves aren't a problem, except that if not designed properly they will slow down keywording enormously. We once saw some software written for a client which had hundreds of check-boxes to choose from. It took forever scanning these to make a selection - mindbending and slow. Likewise, drop-down menus of single words are very slow to use. These devices also tend to make it hard to change vocabularies or word selections without lengthy updating of databases, interfaces or the software itself.

3. Can you drop keywords into a single keywording window?

Even if check-boxes and drop-down menus are used, having this feature will give some versatility. It allows different (i.e. more efficient) systems for generating keywords to be used independently of the software, with the resulting keywords pasted into the field.

4. Do you have to keyword one image at a time?

It seems like the obvious way to keyword images, but it's also the slowest way to work with images. Doing a number of images at once is far faster and tends also to be more consistent and accurate.

5. How easy is it to change keywords already written?

We are often asked to fix up poor keywording, and clients can be frustrated to find out that there is no easy way to do it without ponderously re-keywording using the software they already have. The ideal situation is that the images can be sent for revising with the keywords in IPTC. These can then be amended and resubmitted into the database with the resulting changes overwriting what's already there. Overwriting by re-keywording into a spreadsheet which is imported back into the database is also excellent. Too often, clients are told this cannot be done at all, or only with major, costly, rewriting of the software. Make sure easy overwriting is standard and you'll avoid headaches.
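A minimal sketch of the overwrite-on-reimport idea described above, assuming the revised keywords come back as a CSV spreadsheet. The filenames and keywords are invented, and a real system would read and write its own database rather than an in-memory dict.

```python
import csv
import io

# Hypothetical "overwrite on re-import": revised keywords arrive as a
# spreadsheet (CSV) and replace whatever the database currently holds
# for each image listed. Images not in the CSV keep their keywords.
existing = {
    "IMG_0001.jpg": ["nice", "photo"],       # the poor original keywording
    "IMG_0002.jpg": ["portrait", "studio"],
}

revised_csv = """filename,keywords
IMG_0001.jpg,beach;sunset;queensland;australia
"""

def overwrite_keywords(existing, revised_csv):
    """Rows in the revised CSV replace the matching existing entries."""
    for row in csv.DictReader(io.StringIO(revised_csv)):
        existing[row["filename"]] = row["keywords"].split(";")
    return existing

updated = overwrite_keywords(existing, revised_csv)
print(updated["IMG_0001.jpg"])
```

The whole "fix up poor keywording" job reduces to editing one spreadsheet, which is why easy overwriting is worth insisting on before you buy.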

6. How is your keywording vocabulary organised?

To maintain consistency you should be using a vocabulary to assist in selection of keywords. At some time in the future you are likely to want to change that vocabulary, integrate it with a better one from a third party, or replace it altogether. How easy is it to do that? If the vocabulary can be exported and re-imported as a text document or spreadsheet this should be relatively easy. If the vocabulary is held in a series of fields inside the service provider's database and can't be easily exported, you are looking at a lot of time and money to rectify the problem.
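As an illustration of how a vocabulary held as a plain text document stays easy to change, here is a hypothetical sketch that parses an indented vocabulary file into (term, depth) pairs. The terms and the two-space indent convention are assumptions, not any particular vendor's format.

```python
# Hypothetical controlled vocabulary kept as an indented text file, so it
# can be edited, merged with a third-party vocabulary, and re-imported
# without touching a database schema. The terms are invented examples.
vocab_text = """\
animals
  birds
    parrot
  mammals
    kangaroo
"""

def parse_vocab(text, indent=2):
    """Return (term, depth) pairs from an indented vocabulary file."""
    entries = []
    for line in text.splitlines():
        stripped = line.lstrip(" ")
        if not stripped:
            continue  # skip blank lines
        depth = (len(line) - len(stripped)) // indent
        entries.append((stripped, depth))
    return entries

for term, depth in parse_vocab(vocab_text):
    print("  " * depth + term)
```

Because the whole hierarchy lives in a text file, replacing or merging vocabularies is a text-editing job rather than a database-rewriting one.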

7. Are you able to add or amend keywords any other way than by using the service provider's software?

This is the bottom line when assessing this sort of software for keywording purposes. If the answer to the question is "no", then outsourcing, major vocabulary upgrades, or moving to a better service provider are going to be painful if not downright impossible. Even if the answer is "yes", you may still be up for the cost of messy workarounds. And if the method of adding keywords with the existing software is slow, you will keep on paying the price for that slowness with each new image.

By Kevin Townsend, KEEDUP
Link to original article: http://keywordingcentral.blogspot.com/2009/08/keywording-lemonshow-to.html

Thursday, August 27, 2009

What is stopping you from investing in a DAM solution?

A few months ago, I posted a poll to gauge the main reasons why some organizations have not invested in a DAM yet. Aside from the needed awareness of what a DAM is and why an organization would even need one, which I explain on this blog, I listed a few common answers (aka excuses) based on what I have heard from several other organizations who either had issues implementing a DAM or still have issues implementing one to date.

  • “That involves work, so we’ll continue doing things the same way we’ve been doing it since 1989.”

Charles Darwin said, “In the struggle for survival, the fittest win out at the expense of their rivals because they succeed in adapting themselves best to their environment.” The modern translation in business would be ‘Adapt to 21st century methods of doing business or watch your business swirl around the bowl’. Yes, implementing and continually using a DAM will take some work and consistency, possibly more consistency than your organization had beforehand. Consistency is a good thing and will give you more time for creative projects and for evolving into 21st century methods of doing business. And you will also need to upgrade that 20-year-old technology so it can work for you rather than you working for it.

  • We don’t reuse anything…ever

If you really don’t reuse anything, you probably have a very special business that wastes a lot of time, money and other resources. Once you truly realize what your organization can reuse, you should be able to make more money (eventually), but you cannot make more time…ever. I am not saying you need to be as ‘green’ as grass, but you should start thinking outside of that small mindset, which typically ends at your desktop, and watch what your organization re-creates and/or re-acquires constantly. That is what you can save, make available, reuse and track with a DAM. That can help save your organization some green (aka dollars).

If an organization can’t find assets again, they may not know anything about their own assets either. With a DAM that has proper metadata, you can now know what you have at hand and possibly the licenses for these assets. Get more out of your organization’s intellectual property more often than simply the first time it was created or acquired.

  • We don’t have the staffing because we are too busy looking for things we can’t find

You can watch institutional knowledge walk out the door with the knowledge of where and how to find things:

    • buried in some pile with a magical filing system only one person knows about
    • within a folder structure 50 levels deep
    • on their desktop (the ultimate silo)

People with institutional knowledge walk out your doors voluntarily or (even more frequently nowadays) involuntarily. What are you doing to preserve the institutional knowledge and the intellectual property of your organization when people move on? Where is it kept since day one? Hint: Using a DAM is a start.

  • We have too many people asking for this thing, so we’ll do nothing about it

You can continue procrastinating. It’s only your business. Do you know what you have and what you don’t? Does the rest of your organization know this too? Does everyone know where to find what they need at a moment’s notice?

  • We can’t spend on anything, including tools that will save time and money

It is your business and your money. Are you running low on matches yet? Or do you audit your procedures when dealing with digital assets? You have standard workflows to handle it all, right? Everyone knows where to look to find what they need? It’s all documented for continuity, right?

  • It is too expensive and we prefer burning money on duplicate production time

Are you losing employees in this economic climate? As employees leave your organization for any reason, they take that institutional knowledge with them. It should not take a large group of people lots of time to find or create assets, particularly assets your organization already has. How do you capture the assets (i.e. audio, graphics, presentations, photographs, text, video) they have created or acquired for your organization? Are you going to:

  • Wait until their hard drive is purged after they leave the organization?
  • Print out all emails and read them?
  • Burn a CD only to lose it in a few days or get it scratched?
  • Video tape them babbling about way back when…

No. Save your organization’s assets, time and money. Get a DAM solution for your organization’s business needs for now and the future. How much time will your organization waste recreating or reacquiring the same thing again? DAM helps put an end to that and brings more consistency by using workflows for assets.

How many times do you use your organization’s logo? DAM could report on that and make sure it looks right each and every time…unless you like brand inconsistency. Also, there are more deals to be made in such an economic climate, so it is worth looking into.

  • We don’t believe it can help us because we don’t have any digital assets
  • We don’t need to manage digital assets…yet

This is only possible today if:

  1. your organization still operates without computers OR
  2. the organization is in denial (I am not referring to the long river in Egypt)

The longer it takes an organization to realize they are accumulating digital assets in silos and not managing them properly, the harder it will be to manage those legacy digital assets in the future.

  • [DAM] involves workflow and we don’t have those

Start with whatever procedures you do have when it comes to dealing with digital assets from start to finish. Who does what when and how is it done? Document this in writing. Consolidate the procedures. Streamline where possible and standardize it. What is missing?

  • People?
  • Process?
  • Technology?
  • All of the above?

Get feedback and fill in the gaps. You’ll see for yourself what you need. Then, communicate this to stakeholders with clear solutions.

If an organization doesn’t have standardized workflows, not only does that make it harder to train anyone new within the organization, but it also reveals that there is probably very little consistency and lots of waste. It may be time to create and document the workflows within the organization. It is not a matter of job security (keep dreaming if you believe that actually exists in the 21st century). It is a matter of having procedures that make business sense.

  • We are organized enough and we have no room for improvement nor streamlining

Reached perfection already? Come on! Get your head out of the clouds or wherever else it may be buried. Every organization has room for improvement. As soon as you stop improving your organization, expect to stop growing soon and continue throwing money out the window. Implementing a DAM solution can be one of those improvements, if done right.

  • We prefer sending hard copies of all assets and paying for postage to mail all of it slowly around the globe

That does not need to happen any longer unless you like wasting time waiting. You can send a soft copy (aka electronic copy) of anything around the globe in seconds by email, by FTP, by URL to a download link or multiple other methods, even if it needs to be color proofed, printed, signed or whatever. Stop wasting everyone’s time. DAM can be connected to many methods of communication in order to transfer digital assets inside or outside your organization as needed.

  • We can’t convince the stakeholders in my organization because it involves technology they don’t understand

Document the cases in writing of what happens in your organization today:

  • What happens when it comes to dealing with digital assets from start to finish?
  • Can everyone access what they should access when they need it?
  • Can they find what they are looking for when they do have access?
  • Do you know what licenses are for those assets?
  • Can you easily reuse assets when needed quickly?
  • How much is spent when everyone cannot do this efficiently and effectively?
  • How often does this happen?
  • Where are the gaps in the process?
  • What could fill in those gaps aside from spending unlimited time?
  • Who will lead the effort toward resolving the issues if you get the backing of the stakeholders?
  • Be prepared to submit a budget and time estimates, so do your homework and research it.
  • Ask stakeholders what they need clarification on.

Unless stakeholders are asleep at the wheel of a sinking ship, they may eventually show interest in backing a plan to resolve these issues, particularly nowadays if the ROI can be justified.

  • We are not organized enough to start a project like that because we are beyond help

Get some help from outside your organization. It may be time to start getting organized because it is not going to get any easier if you wait longer.

  • Our IT department does not want to support yet another application

If you have an IT department, get them involved from the beginning of the process of picking the right DAM solution for your organization. Not all DAM solutions require extensive IT work within an organization.

  • We find metadata scary so we’ll just stick to folder structures to help us lose our assets

You didn’t really need to know what you actually have nor where to find it without a map, right? That awesome file naming convention tells you everything about those assets you’d ever need and helps you find the asset too, right? Wake up. Find out why you need metadata and how to apply metadata to assets. Your organization should uniformly decide what metadata standard(s) you need to use and follow them consistently.

  • “...Hold on, I am still thinking of some more excuses”

Do you see that bonfire? Your organization is throwing money into it. As soon as the organization decides they want to join the 21st century because carving stone tablets is no longer profitable for most businesses, they might start looking for a DAM solution.

  • “We are still thinking about it and will continue to think about it for years as we accumulate more assets”

Again, it is not going to get easier if you decide to wait longer. You will continue accumulating more digital assets every year. You can continue being unable to find assets, losing assets and burning money re-acquiring or re-creating them repeatedly. Your organization needs to make the conscious decision when it has had enough of this and decide what solution it will adopt. Ceasing to accumulate digital assets is not one of those options anymore, unless you like throwing more money away repeatedly.

  • What is Digital Asset Management again?

You probably have not read this blog for very long, nor the ones in my blogroll on the right, if you are still wondering what a DAM is. Start reading and learn how DAM can help you and your organization.

Now that you have a few less excuses to use, don’t take it personally, but rather do something about it. What is stopping your organization from investing in a DAM solution?

By Henrik de Gyor on August 26, 2009, Another DAM bLog
Link to original article: http://anotherdamblog.wordpress.com/

Cost of storage for ECM & DAM: Part 2

I recently posted a piece regarding storage costs for ECM that seemed to garner some interest, so I thought I might just flesh out some of the assumptions I made in that a little further. The basic premise was that people who buy ECM and DAM systems tend to underestimate the cost of related storage, and typically do so by a wide margin. In addition to underestimating the costs, buyers typically underestimate the volume of storage required. Combined, these miscalculations can, and often do, prove to be very costly.

In my experience these miscalculations are generally due either to a misguided assumption that storage volumes and costs are falling to the point where they are not worth worrying about, or to the buyer having previous experience with a web content management system and assuming that as storage was a minor issue then, so it will be a minor issue with a full-blown DAM or ECM implementation. The reality is, nothing could be further from the truth.

First, some basics. Although the cost of 1GB of storage has plummeted over the years, the cost of managing the stored data has not. In fact enterprise storage costs have continued to rise year on year. It is easy, though wrong, to equate the cost of disk space with the overall cost of storage. Disk costs represent a very small part of the overall storage bill. Managing the structure, security and access to and from that stored data is what costs a lot more.

Even as disk space costs have plummeted, our appetite for filling those disks has grown at an even greater rate. ECM and DAM storage costs have risen more than most, as both manage bulky content files. The increased use of rich media, PowerPoint, Flash files, video, audio or even just the use of graphics in typical office documents has bulked up storage demands way beyond anything one could have predicted just a few years ago, turning many multi-terabyte situations into petabyte ones today.

As the sheer volume of content being stored has grown exponentially, so too has the realization that hidden amongst these volumes are items of real business value, and/or items that could get us into trouble if lost (or found). Hence the need for backup for basic protection, disaster recovery to ensure that we can survive if everything gets hit in a single location, and archiving so that important content can be separated and actively managed over the long term. Enterprises must do this to meet compliance and legal needs. All of these, of course, add considerably to costs, though they do enable content owners to sleep well at night.

Some buyers just want to push it all to "The Cloud" and if that works for them, great - but that is not necessarily a low-cost option. As a rough guide, 1 petabyte of storage will cost you around $150k per month using Amazon S3; yes, that's $1,800,000 per year. Of course one can argue that "The Cloud" does all the DR and backup work for you, so there could be cost savings there, but it's still not exactly cheap. Yes, I know not everyone will need a petabyte of storage, but my point remains valid, as you will likely need far more storage space than you think you do. Whether you end up with a fiber channel SAN courtesy of NetApp, Hitachi or EMC, or you opt for the Cloud, you are going to pay out a lot of money.
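The figure above is easy to sanity-check. The sketch below redoes the arithmetic using the roughly $0.15 per GB per month that the quoted 2009-era S3 price implies; current prices are tiered and different, so treat the rate as an assumption, not a quote.

```python
# Back-of-envelope check of the article's cloud-storage figures,
# assuming a flat ~$0.15 per GB per month (2009-era S3 pricing).
rate_per_gb_month = 0.15
petabyte_gb = 1_000_000          # 1 PB ~= 1,000,000 GB (decimal units)

monthly = petabyte_gb * rate_per_gb_month
yearly = monthly * 12
print(f"${monthly:,.0f}/month, ${yearly:,.0f}/year")
```

The result matches the article's $150k per month and $1.8 million per year, which shows how quickly a "cheap per gigabyte" rate compounds at petabyte scale.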

But hold on a second: surely this all assumes that such costs are inevitable and indeed necessary, that the only error is the fact that you, the buyer, underestimated them? In fact the major error here is that most buyers of ECM and DAM systems are not thinking about using storage systems in the way they were designed to be used. Theoretically at least such systems allow you to clear out junk (irrelevant, duplicated or redundant) on an ongoing basis, and only manage key data or files. This behavior is sorely lacking from our content management routines. Moreover, better systems integrate well with most common storage options, providing fairly seamless retention and disposition management, in some cases even going so far as to help in the automation of tiered storage. But few buyers ever make any use of these features and they become little more than electronic buckets, buckets that get filled in random order.

So, what is the lesson here? Well maybe there is more than one lesson, for starters:

  • You should always ensure that accurate storage calculations are an early and important part of any ECM or DAM project
  • Put proper content governance in place to ensure you're not paying for space you don't need (Consider that a business case for ECM and DAM can often be made simply based on the savings derived from an efficient retention and disposal process)
  • Finally, the next time you hear somebody say that enterprise storage is getting cheaper and cheaper, hit them, they deserve it.

By: Alan Pelz-Sharpe, Analyst, CMS Watch, 24-Aug-2009
Link to original article: http://www.cmswatch.com/Trends/1671-Storage-for-ECM-DAM-Part-2?source=RSS


Storage costs for ECM and DAM Systems

We have had an interesting internal discussion at CMS Watch the past few days, surrounding the cost of storage for content technology systems. It is a discussion that too few buyers have before embarking on costly ECM or DAM projects. Enterprises often assume that storage, just like networking, is simply a corporate IT overhead -- and maybe in your organization it is -- but that doesn't mean it's free.

So how much does storage cost? Well, it's like the answer to the question "how long is a piece of string?": "it depends." But a better answer may be "more than you might think." In many cases much more, and quite often more than the application software and associated services will cost you.

The variables here revolve around the type of content you manage. If it's mostly html files, then the volume will not likely be too high. If on the other hand, it is typically Office files, or worse still Rich Media (such as video files) then your storage needs will shoot up. These days Terabytes of data are the norm; typical ECM installations have 25-50TB of content, and some run into the multiple Petabytes. And these numbers are only likely to grow as rich media and ever richer documents become the norm.

Another factor that can seriously impact your storage costs surrounds the issue of lifecycle management, and version control. If redundant files are moved off the live system to an active archive as and when they become redundant -- and in turn are destroyed when they reach the end of their lifecycle -- your storage costs will become more manageable.

If, on the other hand, you just keep everything, the costs will skyrocket. In previous consulting engagements for large enterprises I have found less than 4% of the content sitting in file servers to be relevant: the vast majority of the content is out of date, duplicated, or not even business related (porn, recipes, family photo albums, and the like). Good content housekeeping is just common sense.

So how much does storage cost? Well, depending on your media of choice, 25TB will cost you anywhere from just under $100k per year for a hosted service to $350k if you were to buy, install and manage the hardware in-house. If you get into the Petabyte category then you are in the millions to start the discussion.
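To put those two estimates on a comparable footing, the quick calculation below converts them to cost per gigabyte per year. Both annual figures are the article's estimates, not current market prices.

```python
# Rough per-gigabyte comparison of the two 25 TB figures quoted above
# (hosted service vs buying and managing hardware in-house).
capacity_gb = 25_000             # 25 TB in decimal GB
hosted_per_year = 100_000        # ~ $100k/year, hosted
in_house_per_year = 350_000      # ~ $350k/year, in-house

print(f"hosted:   ${hosted_per_year / capacity_gb:.2f} per GB per year")
print(f"in-house: ${in_house_per_year / capacity_gb:.2f} per GB per year")
```

Either way the per-gigabyte cost is orders of magnitude above the retail price of a bare disk, which is the article's central point.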

You of course need to look at all the storage options, and understand that storage costs are not as simple as calculating disk space. There are trade-offs in terms of access, performance, and price between Direct Attached, SAN, and NAS options. You will likely end up with a mixed environment. You also need to consider the cost of, and need for, Disaster Recovery, Back-Up, and Archiving in addition to your primary requirements.

Likewise you will need to consider the physical distance between consumers and stored data. Don't let anyone kid you -- a file accessed in Tokyo sitting in a file server in New York will take longer to render than one sitting in Kyoto...

And yes, just in case you are thinking about it: to back up one Petabyte, you will need another Petabyte at least....soon adds up doesn't it?

There is a myth that storage is cheap; it's not cheap. Just because you see 1GB flash drives going for a song at your local store, does not mean that enterprise storage costs have plummeted. They haven't. Moreover, they are not going to, since our needs remain insatiable and are only set to grow. So next time you are considering investing in a system like SharePoint, FileNet, Artesia, or MediaBin, make sure you think through your storage needs carefully, and cost them realistically from the outset.

By: Alan Pelz-Sharpe, Analyst, CMS Watch
Link to original article: http://www.cmswatch.com/Trends/1662-ECM-DAM-Storage

Tunicca Announces Clever Cost Savings Package

PRLog (Press Release) Aug 10, 2009 – Tunicca, the Business Process Analysis company, has announced a new initiative – a constructive and manageable package to help companies in the present economic downturn. Tunicca, whose core expertise is pre-media business analysis and improvement, is offering an initial inexpensive package of consultancy and purchasing strategy for pre-media enterprises to help businesses streamline production and maximise profit margins. The package is tailored for all types of pre-media operations involved in the complex management of digital files.

Tunicca’s Sean Runchman explains, “Across a wide range of sectors such as Printing, Packaging, Publishing and Entertainment Media, businesses are facing financial challenges like never before. We are offering a low cost package to meet demand and assist pre-media companies in the current financial downturn.” “There is increased pressure to look internally for operational savings,” continues Runchman. “This is where Tunicca can assist. We are able to analyse a company’s production and supply chain model and identify and assess the bottlenecks. With our expertise we are then able to demonstrate solutions - in many cases maintaining existing technologies - which can result in cost savings and increased efficiency.”


Tunicca’s goal is to help companies apply measurement criteria to complex workflows and operations. This enables them to assess the performance of pre-media systems, and Tunicca’s structured approach then helps companies to make savings. Tunicca provides a business-based consulting service in order to deliver business transformation programs, market research projects, technology purchasing strategies and pursuit of best practice.

Tuesday, August 25, 2009

The Top 12 Options for Web Content Management

Gartner has published its Magic Quadrant for Web content management in 2009, to help CIOs and IT decide just what will meet the needs of the enterprise. Web software is now the fastest growing sector of the enterprise content management market, according to Gartner, and was valued at more than $3.3 billion last year.

This annual report identifies the leaders in the industry. We've picked out the top dozen vendors aimed at enterprise content management on the Web. Here are their strengths and weaknesses, to let you get a handle on what's best for your business.

Leaders

While there's some overlap with generally popular offerings — Drupal-based solutions and open source are getting more attention than ever — most of the top dogs in ECM on the Web are specialist vendors who focus solely on the needs of enterprise. Gartner's choices for who fits inside the Magic Quadrant bear this out.

Oracle remains one of the biggest players in this area, despite being better known in many circles for its databases. What really brought them on the scene was the acquisition of Stellent in 2007, and since then their strength has been integrating WCM into their wide-ranging offerings for content management.

Autonomy, which entered the sector through its acquisition of Interwoven this year, tends to appeal most strongly to the marketing side of WCM. While they offer an ability to deliver content that's highly targeted, WCM will continue to be a sideline in terms of profit. While Autonomy can deliver right now, they lack a detailed roadmap for the future of the product.

Open Text is one of the most well-known in this space, and is the top pure-play vendor for content management. Despite a close partnership with Microsoft, Gartner predicts that Open Text's top competitor will continue to be SharePoint and other .NET software packages.

SDL acquired Tridion in 2007, and since then has shown impressive growth. This is the result of solid capabilities in multilingual and multichannel content management, as well as robust SharePoint integration.

Challengers

While the three companies listed at challengers by Gartner hardly seem like underdogs, software from Microsoft, IBM, and EMC are increasing in importance very rapidly.

Microsoft's SharePoint is growing enormously in market share as a Web content management solution. In addition to feeding off the trust that the Microsoft name inspires in just about every CIO, one of the strengths of SharePoint compared to current leaders is the partner ecosystem that is growing rapidly, and how tightly integrated it can be with Microsoft's other products in e-commerce, Web analytics and search. Of course, as fast as it grows in WCM, SharePoint is hated for its weaknesses as an intranet and document sharing system.

IBM's greatest strength in content management is also its greatest weakness. The fact that Lotus WCM is vertically-focused and is closely integrated with the entire Websphere Portal makes it appealing to organizations who already use the portal. But for those who don't, Lotus seems lacking without the rest of IBM's system.

EMC has been slowly brewing its WCM capabilities since it acquired the fairly popular Documentum in 2003. EMC is especially good in the sense that it works with a broader ECM solution, and can do DAM and records management of Web content too. It's also shown some of the biggest improvements in its latest release, 6.5, through adding technology first used in X-Hive, an XML database and dynamic delivery environment.

Visionaries

A list of honorable mentions shows up in the Visionaries section of the Magic Quadrant. The label might sound fanciful, but most of these up-and-coming vendors are making a real name for themselves.

Sitecore is a Denmark-based company that's also a Microsoft Gold Certified Partner. Their .NET CMS is unsurprisingly tied to Microsoft in a multitude of ways, which can be either a plus or a minus, depending on where your enterprise stands technologically.

FatWire Software has a Java software package that focuses on collaborative features and analytics, but suffers from what Gartner calls "costly" customization needs to make it play nice with related technologies.

Ektron is especially famous in the SMB market. Its CMS400.NET integrates relatively well with SharePoint Server.

Day Software sells software based on Java EE, and it's made strides in usability. Despite these improvements sales have lagged, and Gartner predicts that the partnerships with IBM and HP that this Swiss-based vendor has will decline in the future, weakening its position.

Clickability is a pure SaaS vendor in Web content management that is making progress with enterprises fed up with the costs of on-premise WCM, even if SaaS itself remains on shaky ground.

There are at least a dozen more vendors in enterprise-class Web content management who get short mentions in Gartner's report. Solutions such as Alfresco's open source software or Acquia's Drupal distribution might not warrant inclusion in any list of leaders yet, but they're making respectable gains. For a detailed account, be sure to read the full report.

Link to original article: http://www.readwriteweb.com

Friday, August 7, 2009

SeeFile Announces Version 4.7 DAM Software

Boston, MA – August 5, 2009 – SeeFile Software, the leading developer of web server software for sharing media files, is pleased to announce SeeFile 4.7. The new version offers an elegant web interface, an easy-to-use media bank and a delivery system accessible from any standard web browser. Development focused on backend changes that speed up page loads and image processing, along with layout optimizations and new functionality. SeeFile 4.7 includes the following new features:
  • Optimized and easier-to-use user interface
  • Email notifications
  • Complete log history per user and action
  • Annotations on multi-page PDF files
  • Full support for current web browsers: Firefox 3.5, Safari 4 and Internet Explorer 8

The system monitors shared folders and volumes, tracking any changes in the file structure. When a change occurs, SeeFile instantly updates its database and web interface to stay synchronized with the file system. As it detects changes, SeeFile also automatically renders previews and imports metadata. All of this means that SeeFile customers spend less time uploading files (since no uploading is required) and less time getting files approved.
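The change-detection idea described above can be sketched with a simple snapshot-and-diff approach. This is purely illustrative and not SeeFile's actual implementation; the function names and polling strategy are assumptions.

```python
import os

def snapshot(root):
    """Map each file path under root to its last-modified time."""
    state = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                state[path] = os.path.getmtime(path)
            except OSError:
                pass  # file vanished between listing and stat; skip it
    return state

def diff(old, new):
    """Return (added, changed, removed) paths between two snapshots."""
    added = [p for p in new if p not in old]
    changed = [p for p in new if p in old and new[p] != old[p]]
    removed = [p for p in old if p not in new]
    return added, changed, removed
```

A monitoring loop would take periodic snapshots, diff them, and kick off preview rendering and metadata import for each added or changed file; a production system would typically use native file-system notifications rather than polling.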

A different approach to digital asset management and delivery
Currently available digital asset management systems force users to change their workflow, adding costs in training and in building an optimized database. SeeFile offers a solution focused on small and midsize companies. Installing SeeFile takes only minutes, and the solution works like an FTP server (allowing the creation of users with home folders) that fits into nearly any workflow. The system offers all the standard tools found in digital asset management and delivery software. An administrator can create data fields for tagging documents, and all fields, including imported EXIF, IPTC and XMP metadata, are searchable. To meet the expectations of a Web 2.0 application, SeeFile offers collaboration features including annotations on selected areas of images and PDFs, and online approval. With SeeFile, companies involved in photography, print, video, brand management and advertising can archive their media files and manage multiple projects with their customers for a fraction of the price of other solutions.
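The field-based tagging and search model described above can be illustrated with a minimal in-memory example. The field names, file names and values here are hypothetical, not drawn from SeeFile's product; the point is only that admin-defined tags and imported EXIF/IPTC/XMP values live in the same searchable namespace.

```python
# Each document carries a dict of metadata fields, mixing
# administrator-defined tags with imported IPTC/XMP-style values.
documents = [
    {"file": "shoot1.jpg", "fields": {"client": "Acme", "IPTC:Keywords": "beach"}},
    {"file": "logo.pdf",   "fields": {"client": "Acme", "status": "approved"}},
    {"file": "shoot2.jpg", "fields": {"client": "Other", "IPTC:Keywords": "city"}},
]

def search(docs, criteria):
    """Return files whose metadata matches every field=value criterion."""
    return [d["file"] for d in docs
            if all(d["fields"].get(k) == v for k, v in criteria.items())]
```

A query such as `search(documents, {"client": "Acme"})` matches on an admin-defined field, while `search(documents, {"IPTC:Keywords": "city"})` matches on imported metadata, with no distinction between the two at search time.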

Pricing and Availability
The software is priced from $499 to $4,995, depending on number of user licenses.



Tuesday, August 4, 2009

Gartner survey claims software spending to rise in 2010

Link to original post: http://www.itp.net/news/563338-gartner-survey-claims-software-spending-to-rise-in-2010

By Nathan Statz on Thursday, July 30, 2009

Analyst firm Gartner has released the results of a survey showing that 25% of respondents in the EMEA region expect their 2010 IT budget to increase.

As part of the research, Gartner surveyed approximately 1,000 IT professionals worldwide in April and May. When asked whether they expected their 2010 budgets to fall below, match or exceed this year's IT budget, 25% of respondents from EMEA expected an increase, along with 30% from Asia/Pacific and 28% from North America.

"Software vendors should continue to build, fund and invest in software sales and marketing programs, even during tight market conditions to maintain customers and expand revenue opportunities," said Joanne Correia, managing vice president at Gartner.

Put another way, 75% of respondents from EMEA indicated their 2010 IT budgets would be the same as or below 2009 levels.

“A market downturn is a disrupter that creates great marketing and sales opportunities for organisations prepared to take advantage of the right products, marketing programs and funding,” added Correia.

The survey follows another report released this month by the research firm, which predicted a 6% decrease in global IT spending for 2009.

Gartner’s research also revealed that software budgets are expected to rise by 0.45% in 2010 in the EMEA region, compared to a global average of a 1.53% increase. The worldwide average is boosted by a predicted 2.54% rise in Latin America and 4.34% in the Asia/Pacific region. North America is expected to decline by 2.06%.



