How it all got started
by George Dochev,
Co-founder & CTO, LucidLink
April 2024
4 mins

A globally distributed team, slowed by distance
Back in 2012, I was working remotely from my home in France, while most of my team was in two offices: our main office in Florida and a branch in Bulgaria. We also had a couple of engineers around the globe working from home, like myself.
We were building the next-generation storage virtualization software at DataCore, called SANsymphony. The source code for the product was already quite large, consisting of tens of thousands of files, and building it required dedicated hardware.
We had set up an automated build system located in Florida, which would kick off a build after each commit into the source code repository. Each build was around 10 GB, and we had more than 10 builds per day, resulting in a large volume of data produced daily.
All of our team members needed access to those builds for testing and debugging. The bulk of our test team was in Florida. Often, as part of their testing, they generated dump files which had to be analyzed by developers in other countries. These files were also in the gigabyte range and accumulated quickly.
The remote file sharing problem
The two main offices were connected via a VPN tunnel, so the Bulgarian team could see the same file shares as the Florida team.
Unfortunately, due to the large distance, the remote file shares proved excruciatingly slow. Often, hours would be wasted waiting for the next crash dump to get copied over before troubleshooting could begin.
We had a guaranteed bandwidth network connection, which in theory would give us good transfer rates, yet in practice we were getting only 1/20th of the available bandwidth. After testing, we came to the realization that the primary reason for the slow access was the inefficient network file protocols. They worked well on the local network, but performed very poorly over long distances.
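The bandwidth gap comes down to latency: a chatty protocol that waits for a round trip on every small request is capped by distance, not by the pipe. A rough back-of-the-envelope sketch (the RTT and request-size numbers here are illustrative assumptions, not measurements from our setup) shows the effect:

```python
# Rough illustration (assumed numbers): why a chatty file protocol
# underuses a fast long-distance link. If every 64 KB read blocks on
# a full round trip, throughput is capped by latency, not bandwidth.

def effective_throughput_mbps(rtt_ms: float, request_size_kb: float) -> float:
    """Max throughput (megabits/s) when each request waits one round trip."""
    seconds_per_request = rtt_ms / 1000.0
    megabits_per_request = request_size_kb * 8 / 1000.0
    return megabits_per_request / seconds_per_request

lan = effective_throughput_mbps(rtt_ms=1, request_size_kb=64)    # same office
wan = effective_throughput_mbps(rtt_ms=120, request_size_kb=64)  # transatlantic

print(f"LAN, 1 ms RTT:    {lan:.0f} Mbps")
print(f"WAN, 120 ms RTT:  {wan:.1f} Mbps")
```

With these assumed numbers, the same protocol that saturates a LAN manages only a few megabits per second across an ocean, regardless of how much bandwidth is provisioned, which is consistent with getting a small fraction of a guaranteed-bandwidth link.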
To alleviate some of the pain, we wrote our own tool to synchronize the builds produced in Florida to a local server in Bulgaria, so the team there would have local access. While this scheme worked OK for the branch office in Bulgaria, the rest of us working from home didn't have the dedicated server or the network bandwidth to do full synchronization. We still had to rely on painfully slow VPN/file share access.
Add to that the fact that some of us were using Macs — which had their own interoperability issues with Windows file shares — and you get the picture. Every three to four weeks, I would rotate between my home in France, our office in Bulgaria and the one in Florida. As I worked between all three locations, I noticed how much less productive we became the further we were from the main office.
The idea is born: a new approach to file access
That's when I thought, wouldn't it be great if there was something like a distributed file system for the Internet? A system that would span all locations, so the whole team could see one global namespace and access its files, irrespective of where they were.
It wouldn’t require VPN. It would be designed to work efficiently over the Internet by maximizing the available network bandwidth and reducing the chattiness. I wouldn’t have to synchronize the whole file system, like the existing file syncing services. Instead, I would directly access the remote files just like on a network share.
So, instead of copying a several-gigabyte crash dump file, only to discard it when I am done, I would instantly open it in the debugger and start the analysis. I could also selectively sync certain folders, so the branch office could maintain replicas of the builds. And, I could pin down a particular build that I used more often, so it would be available locally on my laptop — even when offline.
This Internet file system would work equally well on all operating systems, not just Windows, and seamlessly switch between online (connected) and offline (disconnected) mode. It wouldn't require complex configuration from IT, and it would work on any desktop or mobile device. It would be fast and easy to use, integrated so seamlessly with the OS that it's practically invisible to the end user.
Wouldn’t that be awesome? And so the idea was born.
A hard problem worth solving
I was sure many distributed teams had similar needs, but surprisingly, there wasn’t anything like that on the market. I started researching and quickly came to the realization that while it all sounded pretty good on paper, creating something that works reliably would be daunting.
It was a classical distributed system with lots of moving parts running in a heterogeneous environment with devices constantly appearing and disappearing from the network. Hmm… I was starting to see why no one had tackled this successfully.
But it was a formidable problem that would bring significant value once solved.
I was game.