Media Asset Management Market

The media asset management market is forecast to soar to $16.31 billion by 2029, at a compound annual growth rate (CAGR) of 22.6%.

Read More @ https://goodprnews.com/media-asset-management-market-forecast/

#marketresearchreport #marketresearch #marketintelligence #marketreport #industryanalysis #TheBusinessResearchCompany #TBRC #MediaAssetManagement

Reading Time: 4 minutes

When looking at hard disk options I noticed that the Seagate One Touch Hub comes with six months of Mylio, so I decided to try the app without buying it. My first thought is that it claims to replace Google Photos and iCloud, and yet the cost per month is similar. Compared to self-hosting solutions like Immich, PhotoPrism and Nextcloud it is not that interesting, especially since it runs on Windows and macOS but not Linux.

The Downside

After a few minutes of playing with the app on Windows, two iPhones and a MacBook Pro, I find it confusing to use. It doesn't align with what I expect. You can tell it where to find photos and tell it to keep them there, but I don't see an option to use an external drive as the storage space. I also find navigation confusing; it's not as intuitive as I would like.

The Upside

The best feature I have seen in this app is the ability to download photos not just from Instagram and Facebook but also from Frame.io and Flickr. I especially want to download my photos from Flickr: I have decades of memories there, especially from 2005 onwards, and I want to verify that they are saved elsewhere before dumping the service. Once the photos are downloaded I can cancel the paid account and forget about Flickr.

For this option to work I need to be able to tell it to copy files to an external drive as I have gigabytes of files to rescue, dedupe and store.

Photo Dedupe

As you take photos and export them to a hard drive or laptop, you sometimes keep them on the card as a backup in case the laptop or other device fails. The issue is that you may import the same photo three or more times. If you organise your files by year-month-day it's easy to spot duplicates and delete them.

Mylio, Immich and PhotoPrism all have duplicate detection. PhotoPrism detects duplicates on ingest; I think Mylio detects them later, when you run the app.
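The general idea behind duplicate detection is to compare file content rather than file names. This is a minimal sketch of content-hash matching in Python; it illustrates the concept only, and is not how any of these apps actually implement it:

```python
import hashlib
import os

def file_hash(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in one-megabyte chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root):
    """Group files under `root` by content hash; return groups of two or more."""
    by_hash = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            by_hash.setdefault(file_hash(path), []).append(path)
    return [paths for paths in by_hash.values() if len(paths) > 1]
```

Two photos imported three times from the same card will hash identically whatever they are named, which is why this catches duplicates that a name comparison misses.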

Similar to Immich and PhotoPrism

Some of the features are similar to PhotoPrism and Immich: facial recognition, tagging, sorting by various EXIF fields and more. It also lets you sync between your phone, iPad and laptop, and mark which images to keep private, semi-private for a group of friends, or public.

It also provides you with Spaces. Spaces are photo galleries where you can choose to sort photos by activity, or group. You could have a hiking space, a diving space, a via ferrata space and more. You could also have a space for a conference, or for photos with university friends, and more.

Cost Comparison

Flickr is 80 USD per year for unlimited storage. iCloud is 120 USD per year for two terabytes. Infomaniak kDrive is 67 CHF per year for two terabytes. Mylio is 99 USD per year for unlimited storage. The question is how many gigabytes or terabytes of storage you actually need. I'm not a light user and I don't hit a two terabyte limit with photos; I do hit it with videos.

The second question is whether you use this as your offsite backup or your primary backup. If you use it as your primary backup then you are trapped until you can export your files.

And Finally

In writing this article I looked around and found that in order to use an external drive you need to commit to a subscription. You can test it free for thirty days, but you have to pay to back up your own photos to your own external drives, and for me that is a fatal flaw. I am happy to pay for apps as a one-off. I paid for Final Cut Pro X a few years ago, and most recently I paid for all the features of the PhotoSync app: I was fiddling with it one day, saw a promotion for lifetime premium at 15 CHF, and took it. Now I can sync my phone to plenty of apps automatically, including PhotoPrism.

More than a decade ago I paid about 1600 CHF for Final Cut Studio because I wanted to own it for professional projects, after using university copies during my studies. I refuse to pay for Adobe Creative Cloud and other monthly subscription plans because I want to own an app and use it when I need it. I don't want to pay monthly for an app that I may not use for weeks or months at a time.

Adobe Creative Cloud costs 40 CHF per month, or 480 CHF per year. FCP X cost me 300 CHF and I have owned it for years. In under four years of subscription fees you would have paid what Final Cut Studio cost me outright.
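The break-even arithmetic behind that comparison, using the prices quoted above:

```python
def years_to_break_even(one_off_price, yearly_subscription):
    """Years of subscription fees needed to match a one-off purchase price."""
    return one_off_price / yearly_subscription

# Final Cut Studio at 1600 CHF versus Adobe Creative Cloud at 480 CHF/year:
print(round(years_to_break_even(1600, 480), 1))  # 3.3 years
# FCP X at 300 CHF versus the same subscription:
print(round(years_to_break_even(300, 480), 1))   # 0.6 years
```

So a one-off FCP X purchase pays for itself against a Creative Cloud subscription in well under a year, and even the full Final Cut Studio price is matched in under four.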

Back on Topic

The reason for using PhotoPrism, Immich and Nextcloud is to stop paying monthly fees. Mylio goes the other way: back up your phone and your laptop, but once the internal hard drives are no longer enough, start paying 99 USD per year, or 149 USD every two years. In theory it's unlimited, but normal people probably don't have eight terabytes of photos; they have as many as fit on their phones, and not much more.

Conclusion

Carbon Copy Cloner charges 44 CHF for the use of their product. OpenAudible charges about 30 CHF for theirs. PhotoSync costs about 21 CHF, once, for all functionality. Mylio charges 99 USD per year to sync your files to your own external hard drive. That functionality should be about 20 CHF, once; I do not want to be charged a recurring fee to use my own hardware.

https://www.main-vision.com/richard/blog/very-quick-thoughts-on-mylio/

#album #mediaAssetManagement #photos #synched

Very Quick Thoughts on Mylio

Richard's blog

Reading Time: 3 minutes

For two weeks I have been sorting through terabytes of data and it has been a journey through time. It's easy to collect data: every so often, when the laptop is full, you move that data to a hard drive until that drive is full, then on to the next, and the next, until you have a drive or two per year, for several years.

What makes this interesting is that these drives hold dmg files, iso images and more. They also hold fitness tracker files and dive logs, and versions of your website as it changed over the years. They remind you of when you used Adobe AIR and a Nokia N95 8GB. They also give you access to files you had completely forgotten about but are happy to find. It's the photos and videos that have real value.

Aperture and iPhoto Libraries

I tried to move Aperture and iPhoto libraries, but they are very annoying to move because they contain tens of thousands of files: thumbnails, preview files, a complicated folder structure and general chaos. Moving a photo library from one drive to another is very slow because of all the files within. I usually choose “Show Package Contents”, go to the “Masters” folder, and move the photo albums, year by year, to an external folder. Transferring galleries this way is much faster: it takes the time to move the data, rather than the time to recreate the complex folder structure.
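That per-year move can be scripted. This sketch assumes the usual library layout with originals grouped by year under a "Masters" folder; check your own library with "Show Package Contents" first, since older iPhoto versions used different folder names:

```python
import shutil
from pathlib import Path

def export_masters_by_year(library, destination):
    """Copy the per-year folders out of an Aperture/iPhoto library bundle.

    `library` is the .aplibrary/.photolibrary path; the originals are
    assumed to live under a "Masters" folder organised by year.
    """
    masters = Path(library) / "Masters"
    dest = Path(destination)
    dest.mkdir(parents=True, exist_ok=True)
    for year_dir in sorted(masters.iterdir()):
        if year_dir.is_dir():
            # Copy one year at a time so an interrupted run is easy to resume.
            shutil.copytree(year_dir, dest / year_dir.name, dirs_exist_ok=True)
```

Copying year folders wholesale avoids touching the thousands of thumbnail and preview files that make moving the whole bundle so slow.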

Old Versions of Linux

In the process of clearing drive space I found that I had old versions of Linux on several drives. Some of those might have been good for 32-bit challenges. I don't remember if I kept any, or deleted them whilst trying to reduce the number of gigabytes to transfer.

Pruning Files

Every one hundred gigabytes I transfer takes about an hour, so if I can delete several hundred gigabytes of files I will save several hours of sitting and waiting. I'm writing this blog post as I move 278 gigabytes in about two hours. When the estimate says “more than two hours” it means it will take almost three. Moving terabytes of data between drives takes a lot of time, especially with older drives and a complex file structure.
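That hour-per-hundred-gigabytes rule of thumb makes the estimates easy to sketch:

```python
def transfer_hours(gigabytes, gb_per_hour=100):
    """Rough transfer estimate at about one hundred gigabytes per hour."""
    return gigabytes / gb_per_hour

print(transfer_hours(278))   # 2.78 hours, i.e. "almost three"
print(transfer_hours(1000))  # 10.0 hours per terabyte
```

The real rate varies with drive age and file structure, which is why a "more than two hours" estimate tends to drift towards three.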

Diving into Final Cut Pro Packages

Final Cut Pro uses package files. Within these packages you will find render folders, transcode folders and other files. You can safely remove render files, but with originals and transcoded media I would double-check that the files are still available somewhere else before deleting them. Transcode, render and preview files are regenerated when you open a project, so you can remove them to save space once a project is finished.
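Before deleting anything it helps to see how much space the regenerable files actually take. This is a sketch; the folder names here are assumptions about how FCP names its cache folders inside a library bundle, so verify them against your own project first:

```python
import os

# Folder names assumed for FCP's regenerable caches; confirm against
# your own library before deleting anything it reports.
REGENERABLE = {"Render Files", "Transcoded Media"}

def regenerable_size(library_path):
    """Sum the size in bytes of cache folders FCP can rebuild on demand."""
    total = 0
    for dirpath, dirnames, filenames in os.walk(library_path):
        if os.path.basename(dirpath) in REGENERABLE:
            # Count everything under the matched folder in one go…
            for root, _dirs, files in os.walk(dirpath):
                total += sum(os.path.getsize(os.path.join(root, f)) for f in files)
            dirnames[:] = []  # …and stop the outer walk from descending twice.
    return total
```

Reporting the size first turns "I think this is safe to delete" into a measured decision about how many gigabytes, and hours of transfer, the cleanup will save.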

For some reason I had gone into an FCP package file and moved the folder structure outside of it. It took me a while to realise this. As soon as I did, I copied the folder back into the package, and once I had established that all the files were duplicated I deleted the duplicates and saved several hundred gigabytes. I also saved several hours of transfer time, and the time I would otherwise have spent sorting. I still have 1.8 terabytes to move, so that's another 16 hours or so to go.

And Finally

When consolidating files from several smaller drives onto a large central volume, the most time consuming part is moving the data from the old drives to the newer ones. It's at least an hour per hundred gigabytes, so ten hours per terabyte. The one good thing is that you can start the transfer and forget about it until you notice a message about duplicates, or similar.

This is time consuming, but eventually I will have well organised files and I will be able to add the volume or two to PhotoPrism, and PhotoPrism will index everything. It will be worthwhile in the end.

https://www.main-vision.com/richard/blog/journey-through-time/

#finalCutPro #mediaAssetManagement #memories

Journey Through Time

Reading Time: 4 minutes

One of the easiest things to do is to buy a hard drive, fill it over a period of time, then get another drive, and a third, and a fourth, and a fifth. People go from a small drive to a larger drive, and a larger drive after that, and back up plenty of files across three to five drives. The result is that you have terabytes of storage, some files “backed up” on every drive, and others stored precariously on just one. If you then want to consolidate that data onto a single volume, you need a larger and larger drive.

Over the years I have worked as a media asset manager and I have a technique for finding duplicates and freeing disk space, whilst at the same time ensuring that data is still backed up to at least two volumes, if not more. As I deal with video, I organise every project by year-month-day-country-subject-individual, where “individual” is the person working on that project.

Consolidate Media Assets onto a Single Volume

The idea is that as you copy files and folders from a collection of external hard drives, you consolidate everything onto a NAS or other form of storage server. If the date is not provided I use the file names, project title and related information to situate the project in time.
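A sketch of that consolidation step, using a file's modification time as a stand-in date when nothing better is recorded; the real work of reading file names and project titles stays manual:

```python
import datetime
import os
import shutil

def date_folder(path):
    """Derive a year-month-day folder name from a file's modification time."""
    day = datetime.date.fromtimestamp(os.path.getmtime(path))
    return day.isoformat()  # e.g. "2020-06-15"

def consolidate(source_file, archive_root):
    """Move a file into archive_root/YYYY-MM-DD/, creating the folder if needed."""
    target = os.path.join(archive_root, date_folder(source_file))
    os.makedirs(target, exist_ok=True)
    shutil.move(source_file, target)
```

Country, subject and individual can be appended to the folder name by hand; the date prefix alone is what makes later duplicate spotting mechanical.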

Iterate

At first the process is easy, because every project and set of files is unique. The beauty of the year-month-day folder structure is that as you progress you spot that two or three folders have the same name, and that's when you check file dates to see which is the most recent.

Fine tuning

One nuance: if one file is named “final final”, another “I hope this is the final”, and a third “final final final final”, then you change the names. Each file is renamed according to its creation date. This allows you to see within seconds which file was last exported, and helps order the chaos.
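A minimal sketch of that renaming step. It uses the file's modification time rather than true creation time, since creation time is not portable across file systems, and it keeps the old name as a suffix so nothing is lost:

```python
import datetime
import os

def prefix_with_date(path):
    """Prefix a file name with its modification date (YYYY-MM-DD) so that
    "final final.mov" style names sort by when they were last written."""
    stamp = datetime.date.fromtimestamp(os.path.getmtime(path)).isoformat()
    directory, name = os.path.split(path)
    new_path = os.path.join(directory, f"{stamp}_{name}")
    os.rename(path, new_path)
    return new_path
```

After renaming, an alphabetical sort of the folder is also a chronological one, which is the whole point.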

Backing up the Consolidated Files

In an ideal world you would have as many terabytes or petabytes of NAS storage as you need, but this isn't always possible. What I did instead was consolidate the data from the smaller external drives onto the central storage, and then keep those drives, as they were, as a backup in case the NAS fails. This isn't ideal, but it's a good compromise when the budget is limited.

Imagine that you have three or four one terabyte drives and they're all getting full. With a two or three terabyte drive you may be able to back up one or two of them, but run out of space for the others. The solution is to get a four or five terabyte hard disk. With this you can dump the four smaller drives, and then start sorting projects and media assets by year-month-day and by project name. With this workflow you can identify duplicates with ease.

Reducing the Number of Drives Plugged In

The aim of dumping drives A, B, C and D onto drive E is to get rid of the need to have four drives plugged in at once. You create a projects folder for 2024, with each drive's content in its own folder. You open two Finder windows and start organising your projects by year-month-day-project-name, moving each project folder from the Drive A folder into the right year folder. When you finish bringing in drive A you repeat the process with drives B, C and D. When you spot duplicates you delete the duplicate files.

Removing Duplicates

As you progress you go from having four terabytes of files down to three, then two and a half, and so on. As duplicates are eliminated you regain that space, so your four terabyte drive is not filled instantly. Ideally you would mirror the four terabyte drive on another four terabyte drive. If space is tight you can re-use the small drives: 2024 can be split across two or three of them. Now you have a primary storage solution and several secondary drives. If the four terabyte drive fails, you either fail over to the clone, or to the re-used smaller capacity drives.

If the four terabyte volume fails, you get a new four terabyte drive, copy the data from the four backup drives, and within a short amount of time you have recovered.

Shifting to exFAT

For a long time I was using macOS, so I had drives formatted as APFS or Mac OS Extended (Journaled). As I shifted from macOS to Windows and Linux I needed drives that could be read by all three systems. By consolidating media I gave myself the room to back up a drive from one volume to another, convert it to exFAT, and then move the data back.

If the Mac fails, or the Linux machine fails, or the Windows machine fails, I do not want to have to replace that machine just because of a file system. With exFAT I have the freedom to slide from machine to machine, with fluidity.

And Finally

It can be overwhelming to see that you have ten to twenty hard drives that may contain duplicates. By getting a larger drive you can go from a dozen drives, and a dozen places where things are stored, to a single place where everything is centralised. Once everything is centralised you can order files in folders organised by year, and then by year-month-day. In so doing you don't need to know intimately what you're archiving; you just need to be able to organise things by year-month-day.

I know that I repeat this point a lot, but there is a reason. If you know when something was photographed, filmed or worked on, you can find files within seconds rather than hours, and without the CMS, should the CMS fail. The other reason is that a project might have two or three names, depending on who worked on it. If it's organised well, going upstream is easier.

This should be an iterative process. Start with the task that is easiest, then iterate and fine-tune the files and folders until everything is well organised and duplicates are detected and consolidated, or deleted, depending on context. If you do this well it can be relaxing.

I’ve been consolidating files and data to free up drives to experiment with Immich, PhotoPrism and Nextcloud in parallel.

https://www.main-vision.com/richard/blog/playing-with-hard-drives/

#hardDrives #mediaAssetManagement #organisation #storageSpace

Playing with Hard Drives

Reading Time: 4 minutes

While playing with Nextcloud I found a serious flaw. If you add images via the command line from one directory to another and then delete them, their ghosts remain in the timeline. By ghosts I mean references to those files in the CMS, and there is no quick way of removing them: you need to remove them individually, which is time consuming. That's why, while trying to find a solution, I came across PhotoPrism.

PhotoPrism is a photo management app like Lightroom, Google Photos, iPhoto and plenty of other apps. It is open source and can be installed quite easily using this Raspberry Pi solution. With less trial and error than with Nextcloud, I was able to get it up and running within an hour or so.

My first try was with a slow 16 GB card, but that took ages, so I set up another card, this time with 512 gigabytes of storage, to run overnight. In the time it took me to heat dinner the card was ready for me to experiment with, and my first impressions were good.

I tried to upload a few images from a PC, and then from the mobile phone via a web browser, before playing with PhotoSync. The beauty of PhotoSync is that it will upload even when the phone is sleeping or another app is open. I let it synchronise photos while I slept, and as I went shopping for bread for a fondue later today. The moment it seemed to lose momentum was when I got back to the car park after shopping; that's where I lose the mobile phone signal. As I write this, after fourteen or so hours of working almost non-stop, the files are synced between the phone and a Raspberry Pi 4 with 2GB of RAM. They recommend a Raspberry Pi with 4GB or more, but for the sake of tests it seems okay on a lower spec machine.

Indexing

With this app you need to tell it to index photographs. This doesn’t happen automatically.

It can recognise people, create labels/keywords for images, moments, places and more. It also gives you a log of everything it’s doing, from indexing photos to adding locations, to adding keywords, to asking you to name faces that it recognises. Remember that this sits on your personal device, and does not need to ever touch the cloud, if you do not want it to.

Another feature of this app is detecting and flagging low quality and low resolution images, which can be very useful. Sometimes you get junk images from old websites or other directories, and this makes it quick to get rid of them.

When PhotoPrism indexes photos it creates a separate file with the new file names and thumbnails of various sizes; when it completes its task, or when you stop indexing, it merges the old index with the new index.

You have the option of a “complete rescan”, which re-indexes everything, or a “cleanup”, which deletes orphaned index entries, sidecar files and thumbnails. It's because Nextcloud doesn't have an intuitive way of re-indexing files that no longer exist that I was tempted to try this piece of software.

If I were to change two things about indexing, I would add a status indicator to tell me how many images remain without an index; it runs fine in the background, but it would be nice to know how many are left. I also saw someone else say that they wish indexing would run automatically until all images are indexed, and then again when new images are added.

EXIF Data

The app lets you see image EXIF data: latitude and longitude, a title based on the location, ISO, exposure, camera, lens, f-stop, focal length, copyright, subject, description and keywords.

For keywording it uses colours, location, and keywords in the local language, such as “Lac” and “rue”, plus state, country and more. You can add notes and click Done if you change anything.

Humour

It has labelled a cat as a dog, human climbers as lizards, a sign for road works as a monument, and the LHC tunnels as wood.

Filtering

You can filter photos by countries, cameras, newest, month, category, colour, year and more. This makes searching for images quick and intuitive. I also like that it automatically keywords images with the most obvious tags. This allows the human being who is sorting these images to add specific tags, such as people involved, event keywords and more.

Video and Photos

With Nextcloud, when you upload videos it doesn't recognise them immediately, so you get a grey box. With PhotoPrism you see a keyframe and can watch the video within seconds, so this tool can be used for both photographs and video.

Downsides

Unless you pay 2 USD per month you only have one user, so you can't keep admin as just the admin and use your own name as a user. This is sub-optimal for security, but also for family sharing.

PhotoSync is also not free. It costs 6 CHF for the app, and encourages you to pay for PhotoSync Premium.

If you pay for the bonus features you will pay 6 CHF for the app, 25 CHF for lifetime Premium, and another 2 CHF per month for the right to create more users on your own system.

Google Photos is 100 CHF per year. iCloud is 120 CHF per year. Lightroom is 10 CHF per year to 55 CHF per year.

A few years ago I used Kyno by LessPain Software, and that is 159 USD per year.

PhotoPrism is cheap compared to the competition, and by using it you're supporting a European product rather than an American one.

And Finally

Whilst Nextcloud is great for file sharing, time tracking, tasks, news reading and more, PhotoPrism is great for managing photographs. It is quick and easy to install on a Pi: you find the URL, tell Etcher to burn the image to an SD card, put the card into a Raspberry Pi, and within minutes, with a fast card, you can connect either by SSH or by the web interface. Within minutes you can be using the Raspberry Pi as a local photo management tool.

If you install Tailscale and PhotoSync you can be backing up your mobile device's images within a matter of minutes, and it remembers what has been synchronised whether you use the local IP address or the Tailscale VPN one. When you're synchronising thousands of files you want a solution that remembers what you have synced.

I was so convinced by PhotoPrism that I considered replacing Nextcloud with it on the 8GB Raspberry Pi, but chose not to for now, because Nextcloud has time tracking options that I want to experiment with.

https://www.main-vision.com/richard/blog/experimenting-with-the-photoprism-app/

#day406 #googlePhotos #icloud #mediaAssetManagement #photoGallery #photography #photoprism

PhotoPrism - SD Card Image

Official Documentation

Reading Time: 4 minutes

Setting up a drive to be available via Samba is a relatively simple thing to do. The drawback is that the files are only as organised as the person managing them; it can be quite chaotic unless you have someone trained as a media asset manager, archivist or similar to help order photos, videos and more. To some degree Nextcloud is just as disorganised, initially.

I have spent more than five minutes experimenting with Nextcloud through several iterations, and I have finally set things up as I want them. I have Nextcloud running on a Raspberry Pi with 8GB of RAM. I chose this device because it's the highest spec Pi available at the moment without months of waiting. I could have used an HP EliteBook from several years ago, but I want a machine that can be on permanently.

The first sync is slow. Twenty hours ago I started with over 19,000 photographs and videos, and I still have 6,400 remaining. I activated Recognize, an AI app that recognises music genre, objects, human movement in video, people, bodies of water and more. I also have a tool running that maps photographs to show where they were taken.

The beauty of Nextcloud for photo storage is that it allows you to sync from your phone via the app, but it also allows you to upload photos via a web interface, or if you’re so inclined via file transfers on the back end. I have yet to test the latter. The idea is simple. If you have terabyte drives filled with photos that you have already organised by year, month, day, location, and topic, then that file structure should be recognised by Nextcloud. The work that you have done to organise media assets is already done. It’s just a matter of letting Nextcloud see them, and it will take care of mapping, and recognising objects, monuments, images with people and more.

Facial Recognition

With time it recognises faces, and each face is just given a number; you can then provide a name. I added my name to the collection of photos of me. It needs 120 faces before it starts to recognise individuals. As the model is self-hosted, this data stays local to your system.

Object and Landscape Recognition

I noticed that it recognises water, alpine landscapes, signs, boat, bridge, flower, furniture, historic, information and more. The Pi is still working hard to ingest the remaining 5600 photos but when that is done it will have plenty of time to recognise what is in pictures.

When you work as a media asset manager it takes time to tag images, and to add location data. If AI can provide some of this information automatically then it saves a lot of human time. Time that humans can spend adding images to the right folders.

Folder Structure

As a best practice you should always use folders named year-month-day-country-event-name-photographer-initials. If Nextcloud is up and running you can rely on Nextcloud, but if for some reason it crashes, or you can't use a web browser or app, you want to be able to find things according to year, month, day, photographer and topic. Nextcloud should be an embellishment; good media asset management practices should be prioritised.

To some degree the iOS app can help with this, as long as you set things up properly ahead of ingesting all the photographs. I haven’t seen how to set it up yet, but for now I’m still testing the proof of concept, for mobile phone image backup.

Using an Intel Machine

What I am doing with the Pi is experimenting with a Google Photos and iCloud replacement. What I plan to do with the Linux laptop is use the full power of a normal computer as a media asset manager, on a machine that can be turned off when not in use and on when needed. The aim of Nextcloud on the laptop will be to provide me with a one terabyte NAS where I can experiment with what Nextcloud really has to offer.

Tensorflow WASM mode
WASM mode was activated automatically, because your machine does not support native TensorFlow operation:
Your server does not support AVX instructions
Your server does not have an x86 64-bit CPU

The Pi does not have the required x86 64-bit CPU; for that I need to use the Intel machine, which also has GPU acceleration that I cannot use on the Pi. The Pi is good because it can be on 24 hours a day as a quick backup tool for your phone, but an Intel NUC can be a Nextcloud server with the required hardware to do things much faster.
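Whether a given machine meets the two requirements quoted in that message can be checked with a short sketch; it reads /proc/cpuinfo on Linux and reports False where it cannot tell (for instance on macOS, which has no /proc):

```python
import platform

def supports_native_tensorflow():
    """Rough check for the two requirements in the message above:
    an x86 64-bit CPU with AVX instructions."""
    if platform.machine() not in ("x86_64", "AMD64"):
        return False  # e.g. a Raspberry Pi reports aarch64 or armv7l
    try:
        with open("/proc/cpuinfo") as f:
            # The flags line lists avx / avx2 / avx512 variants when present.
            return "avx" in f.read()
    except OSError:
        return False  # no /proc available; assume not supported
```

On a Pi the architecture check alone fails, which is exactly why the Recognize app falls back to the slower WASM mode there.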

And Finally

iCloud and Google Photos are great for backing up when you're out and about. They're less great when you want to recover your photos: if you remove photos from iCloud they are removed from everywhere, so it's dangerous to clear photos to make space.

With Google Photos the issue is that their cloud backup solution costs 34 CHF more than Infomaniak's cloud storage solution. It is for this reason that I wanted a local backup of my Google Photos and iCloud photos. When I set up the Intel machine and ensure that all my photos are backed up from Google Photos, I will be able to purge Google Photos and downgrade my Google One plan.

My aim is not to eliminate Google Photos but to reduce the plan I'm using. I have access to two terabytes but never use them, and Infomaniak is cheaper, so I prefer to have a single plan. The Intel machine will be the main backup, and kDrive the offsite backup.

In the time it took to write this blog post I went from 6400 images to backup, down to 3800.

https://www.main-vision.com/richard/blog/playing-with-nextcloud-continued/

#day403 #experimentation #learning #mediaAssetManagement #NextCloud #photoManagement #raspberryPi

Playing with Nextcloud Continued
