Random Ramblings on LabVIEW Design


Re: Disaster - Public Service Announcement

swatts
Active Participant

We use www.codespaces.com to host our Subversion repositories and bought into the cloud-based business model.

They use Amazon cloud services to host their system and the selling point was secure off-site back-up every 10 minutes. This all seemed perfect for us and it really improved how we do business.

Imagine my horror, then, to be informed that they have been hacked into closure with all back-ups deleted.

This gives me a suspicion about cloud-based businesses. These are essentially small, low-overhead companies holding liability for vast amounts of data. If anything goes wrong they immediately shut up shop, leaving the poor customer up the creek.

So now I have to find an alternative repository/bug reporting/project management tool and change all my documentation.

What a nightmare.

Love

Steve



Opportunity to learn from experienced developers / entrepreneurs (Fab, Joerg and Brian amongst them):
DSH Pragmatic Software Development Workshop


Random Ramblings Index
My Profile

Comments
Thoric
Trusted Enthusiast

OMG, this is terrible news. I hope you didn't lose much/anything Steve? If there's anything I can do to help just let me know.

Thoric (CLA, CLED, CTD and LabVIEW Champion)


swatts
Active Participant

At the moment I'm flapping; in the cold light of day, we have back-ups. Generally we don't have any systems that need undoing.

It's a real shame as it was a great product, but there is a fundamental weakness here that I need to think out fully. What's the point of backing up, having customer passwords etc., if some spotty herbert can get into the Amazon EC2 control panel and delete the lot?

Steve



FabiolaDelaCueva
Active Participant

Hi Steve,

I feel your pain; in fact I started panicking and immediately went searching for what my SCC provider, Bitbucket, does:

https://confluence.atlassian.com/pages/viewpage.action?pageId=288658413

It turns out they have on-site, off-site and even tape backups. That made me feel a lot better. Also, this might be a good opportunity to migrate your projects to distributed source code control (Mercurial or Git are good options), because as the guys at Atlassian say:

"One of the great features of distributed version control systems (DVCS) such as Git and Mercurial is that they're distributed. Backups are still very important, but consider that every person on your team that has forked your repository has a complete copy with a full history."

I recently migrated some SVN repositories to Hg, and Bitbucket had a wizard to make the process painless.
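Fab's point that "every person who has forked your repository has a complete copy with a full history" can be demonstrated with nothing but stock git commands. A minimal sketch (the repository paths are invented, and a local bare repo stands in for the hosted one):

```shell
# Simulate a "hosted" repo with a local bare repository.
tmp=$(mktemp -d)
git init --bare -q "$tmp/hosted.git"

# A developer clones it and pushes a commit.
git clone -q "$tmp/hosted.git" "$tmp/dev"
cd "$tmp/dev"
git -c user.name=dev -c user.email=dev@example.com \
    commit --allow-empty -q -m "first commit"
git push -q origin HEAD

# A second clone elsewhere now carries the FULL history, usable offline.
git clone -q "$tmp/hosted.git" "$tmp/backup"
git -C "$tmp/backup" log --oneline
```

If the host vanishes, any of those clones can seed a replacement server; that is the safety margin a centralized SVN checkout does not give you.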

Good luck, and thanks for sharing your story. Again, I am so sorry this happened to you.

Regards,

Fab

For an opportunity to learn from experienced developers / entrepreneurs (Steve, Joerg, and Brian amongst them):
Check out DSH Pragmatic Software Development Workshop!

DQMH Lead Architect * DQMH Trusted Advisor * Certified LabVIEW Architect * Certified LabVIEW Embedded Developer * Certified Professional Instructor * LabVIEW Champion * Code Janitor

Have you been nice to future you?
Thoric
Trusted Enthusiast

We're toying with Git, JIRA and Stash here at the moment, based on some recommendations. I've tried distributed SCC through Mercurial and it was a better experience than Subversion. Hopefully Git will also prove to be as good.

JIRA is a project management tool that costs about $30 per year for 10 seats, allowing you to manage your requirements, bugs and nits. It ties into Stash, which works with your code releases and version control (so I understand; I've not had a chance to try it just yet).

Very cheap, quite capable (on paper).



FabiolaDelaCueva
Active Participant

Thoric,

At Delacor we use JIRA too; it is very powerful. You can also link your Bitbucket commits to specific JIRA cases. Let me know if you need any help when you try it.

Personally I have liked Mercurial a lot better than Git, but I think that has more to do with the tools available to interact with it. I like TortoiseHg for Mercurial, and I have started to use SourceTree (also from Atlassian) for Git. SourceTree can also be used for Hg and, I believe, even for SVN.

Regards,

Fab

justACS
Active Participant

Good god, man!  My condolences.

swatts
Active Participant

FabiolaDelaCueva wrote:
Hi Steve,

I feel your pain, in fact I started panicking and immediately went on a search of what my SCC provider, Bitbucket does:

https://confluence.atlassian.com/pages/viewpage.action?pageId=288658413

It turns out they have on site, off site and even tape backups.

Word of caution: codespaces pretty much assured us of the same. Backups every 10 minutes, your data is safe with us... blah blah blah. The thing I would look for now, with hindsight, is the ability to batch back-up all repositories to your own system.
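For what it's worth, with git-based hosting the batch back-up Steve is asking for can be scripted with standard commands. A hedged sketch of the idea (the host URL, repository names and paths are all placeholders, not anything codespaces offered; local bare repos stand in for the remote host so the loop can run anywhere):

```shell
# Placeholder setup: two local bare repos stand in for the hosted ones.
# Against a real host you would point BASE_URL at e.g. https://git.example.com
# (hypothetical) and list your real repository names in REPOS.
base=$(mktemp -d)
git init --bare -q "$base/proj-a.git"
git init --bare -q "$base/proj-b.git"

BASE_URL="$base"
BACKUP_DIR=$(mktemp -d)
REPOS="proj-a proj-b"

# First run takes a full mirror clone; later runs just refresh each mirror.
for name in $REPOS; do
  if [ -d "$BACKUP_DIR/$name.git" ]; then
    git -C "$BACKUP_DIR/$name.git" remote update --prune   # refresh existing mirror
  else
    git clone --mirror -q "$BASE_URL/$name.git" "$BACKUP_DIR/$name.git"
  fi
done
```

Run on a schedule, the same loop keeps an always-current copy of every hosted repository inside your own walls.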

Steve



FabiolaDelaCueva
Active Participant

swatts wrote:

FabiolaDelaCueva wrote:

Hi Steve,

I feel your pain, in fact I started panicking and immediately went on a search of what my SCC provider, Bitbucket does:

https://confluence.atlassian.com/pages/viewpage.action?pageId=288658413

It turns out they have on site, off site and even tape backups.

Word of caution: codespaces pretty much assured us of the same. Backups every 10 minutes, your data is safe with us... blah blah blah. The thing I would look for now, with hindsight, is the ability to batch back-up all repositories to your own system.

This is not a problem, because we are using distributed source code control, so all the developers have the entire repository at all times, plus we do a backup of our customers' VMs. It is more of an issue with SVN repositories, but we are slowly migrating away from those.

ohiofudu
Member

Sorry Steve,

Always use the old-school style: an external drive for local back-ups.

Much Love.

Certified LabVIEW Architect
Certified TestStand Architect
swatts
Active Participant

FabiolaDelaCueva wrote:

This is not a problem, because we are using distributed source code control, so all the developers have the entire repository at all times, plus we do a backup of our customers' VMs. It is more of an issue with SVN repositories, but we are slowly migrating away from those.

I like the idea of that, I must admit. I need to be convinced that Git/Mercurial sits well with multi-developer large projects; I'm suspicious about merging etc. This is very likely just ignorance on my part.

Steve



AristosQueue (NI)
NI Employee (retired)

And people wonder why I refuse to upgrade my Adobe Photoshop/Illustrator tools to the new Creative Suite where all files are hosted in the cloud! My design team -- who sits right next to me at work -- is having a nightmare right now because the newest edition of the online Creative Suite is sometimes not saving files to the cloud even though it claims to be doing so. With a disk, I can look at it and see if the timestamp and file size actually changed.

I really do not trust cloud systems. I think we should all have a server at home for our data, and better tools for making those drives available on whatever system we happen to be working on. My files would stay on my machines and not on anyone else's, but be just as available to me as they would be on one of these cloud services.

I really hope I'm not being overly Luddite about all of this, but frankly, the security and legal guarantees just aren't there... none of these companies is willing to accept legal liability for the security and recoverability of data.

PS: What the HELL were the backups doing available over a network connection?! Those should have been saved to physical media and moved offline once per week at the LATEST.

AristosQueue (NI)
NI Employee (retired)

swatts wrote:

I like the idea of that, I must admit. I need to be convinced that Git/Mercurial sits well with multi-developer large projects; I'm suspicious about merging etc. This is very likely just ignorance on my part.

I am unconvinced but trying to become convinced, as NI is now using GitHub for our new community-sourcing experiment. So far, using Git has been nothing but an exercise in hate for me. SVN, TFS, Perforce... all of these are much better tools, IMHO. But they all lack the distributability of Git, so I'm kind of stuck with it.

Daklu
Active Participant

I agree with Mercer. I tend not to trust systems that rely fully on the cloud.

Several months ago I purchased a Synology DiskStation and use it as my main storage repository. There is a lot more to learn, but I feel better knowing I am in control of my files.

Daklu
Active Participant

AristosQueue wrote:

I am unconvinced but trying to become convinced, as NI is now using GitHub for our new community-sourcing experiment. So far, using Git has been nothing but an exercise in hate for me. SVN, TFS, Perforce... all of these are much better tools, IMHO. But they all lack the distributability of Git, so I'm kind of stuck with it.

Is it the lack of good Windows tools for Git, or the transition from a central repository to distributed repositories, that is causing the hate?

James_McN
Active Participant

Wow. I was very seriously considering using this for my next project and was just reading about their upcoming upgrades.

I'll be interested to hear what you move to next, especially if you have things to say about JIRA. I'm currently using FogBugz/Kiln, but I'm not convinced: JIRA might offer a lot, and the FogBugz interface isn't quite as revolutionary as they make out.

As for Git/Hg, in my experience once you've started you won't look back! I find SVN feels quite clunky by comparison.

James Mc
========
CLA and cRIO Fanatic
My writings on LabVIEW Development are at devs.wiresmithtech.com
swatts
Active Participant

It's a real shame as it was pretty much perfect for us. The blurb on all the hosting sites is much the same (back-ups on different continents, off-site, etc.), which leads me to suspect that they are all hosted in a similar fashion. The only thing really missing was open, easy access to our own data; by this I mean a button allowing me to download our own repo and back it up ourselves. Even better would be FTP access to the repo directory and a complete batch back-up.

I have a feeling that Git/Mercurial are more suited to branch-and-merge type projects, and I'm not sure that fits our pattern of doing things. A 500-VI project where a small change to a type-def affects 40 VIs = massive merge headache. I could be wrong here; it's just a feeling with no evidence to support it.

I'm investing in a virtual Linux server and going to host my own repositories, FTP, and eventually bug tracking and project management. It's a big effort, but at least I will be in control of it.

Trac looks like a good tool: http://www.turnkeylinux.org/issue-tracking?page=1

Steve



swatts
Active Participant

AristosQueue wrote:

I really hope I'm not being overly Luddite about all of this, but frankly, the security and legal guarantees just aren't there... none of these companies is willing to accept legal liability for the security and recoverability of data.

PS: What the HELL were the backups doing available over a network connection?! Those should have been saved to physical media and moved offline once per week at the LATEST.

The failure was having one login and mixing up resilience with backup. You read all the blurb and it sounds as if they are going to look after your data better than you can. In reality it has proven not to be the case.

Steve



Thoric
Trusted Enthusiast

swatts wrote:

I have a feeling that Git/Mercurial are more suited to branch-and-merge type projects, and I'm not sure that fits our pattern of doing things. A 500-VI project where a small change to a type-def affects 40 VIs = massive merge headache. I could be wrong here; it's just a feeling with no evidence to support it.

From my short experience with Git (JIRA / Stash / SourceTree), I can see that DSCC is good for large teams of developers, but that doesn't mean it doesn't also work for single developers or pairs of developers.

Merging is a nightmare in LabVIEW, but I rarely (never?) encounter this issue, and that won't change just because I'm trying out Git. I'm experimenting with it now and I was able to:

1. Quickly create a branch of my repository on the remote server repository from within the issue management tool (JIRA) for sandboxing my experimental code fix. A simple single click!

2. Pull the new branch down to my development PC (SourceTree)

3. Make changes to the code and Commit the changes locally then Push the changes up to the remote server (SourceTree)

4. View the branch and master repositories on the remote system easily (Stash), and see the differences (for text-based files only, of course) right there in the interface.

5. "Merge" the branch back into the master using SourceTree merge, which essentially used the modified files to overwrite the master files, and pushed the changes up to the server.
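For anyone curious what those GUI steps look like at a prompt, they map roughly onto plain git commands. A self-contained sketch (the branch name and case number are invented, a local bare repo stands in for the Stash server, and step 4, browsing diffs, is a GUI feature with no command shown):

```shell
tmp=$(mktemp -d)
git init --bare -q "$tmp/server.git"            # stands in for the hosted repo
git clone -q "$tmp/server.git" "$tmp/dev"
cd "$tmp/dev"
git -c user.name=dev -c user.email=dev@example.com \
    commit --allow-empty -q -m "baseline"
git push -q origin HEAD
main=$(git symbolic-ref --short HEAD)           # master or main, depending on git version

# 1-2. create and switch to a sandbox branch for the experimental fix
git checkout -q -b bugfix/CASE-42
echo "the fix" > fix.txt
git add fix.txt
# 3. commit locally, then push the branch up to the server
git -c user.name=dev -c user.email=dev@example.com commit -q -m "Fix CASE-42"
git push -q -u origin bugfix/CASE-42
# 5. merge the branch back into the mainline and push
git checkout -q "$main"
git merge -q --no-edit bugfix/CASE-42
git push -q origin "$main"
```

The point is that the whole branch/commit/merge cycle is a handful of local operations; the server only sees the pushes.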

It's simple, and exists in two places (plus others if people want to synchronise with it), and I can create offline backups by simply backing up my file structure (which contains the git repository inside it).

So far I don't see any problems, and the integration of issue tracking with the projects and repositories seems seamless.



Elijah_K
Active Participant

That's terrible, Steve! So sorry to hear it.

Elijah Kerry
NI Director, Software Community
swatts
Active Participant

That's most interesting, Rich; distributed source control would have saved me a lot of lost versions, so that's a big plus.

Steve



Thoric
Trusted Enthusiast

No matter what we do, our online data is never safe! I especially like the box "do you use cloud services operated by meat-based life forms?"

http://www.gliffy.com/_ui/images/examples/example_flowchart_safeData_large.png



swatts
Active Participant

I thought for a bit about announcing this all to the world, but hopefully my pain will teach us all that NO, NO IT ISN'T

I am going to host my own git server too; let's see which system wins.

Steve



Thoric
Trusted Enthusiast

We don't use any cloud-based storage here, except our email through Office 365 (purely because managing our own email server was becoming too much for IT). All Git/Subversion/Mercurial systems are hosted locally. Safe. Secure. Backed up nightly to local and off-site storage.



AristosQueue (NI)
NI Employee (retired)

swatts wrote:

The failure was having one login and mixing up resilience with backup. You read all the blurb and it sounds as if they are going to look after your data better than you can. In reality it has proven not to be the case.

Out of curiosity, do they or any of the cloud sites offer an easy way for you to pull your own backups AND push an overwrite in the event that you need to restore? I haven't really investigated it, but the ability to easily pull backups of your entire cloud repository would go a long way towards me feeling comfortable with letting them *host* the cloud, just not maintain it.

swatts
Active Participant

That was what I was originally looking for; with codespaces it was a manual/support operation.

The perfect interface for me would have been a button on the repository screen to export the whole repo. When I put my own server in I will also be FTPing a backup and eventually syncing it to my local backup.

I really don't want to get involved in IT; I need to get myself a Skeletor outfit, I think.

Steve



James_McN
Active Participant

swatts wrote:

TRAC looks like a good tool http://www.turnkeylinux.org/issue-tracking?page=1

I did look at Trac, but I don't think it supports multiple projects (or at least not easily), which doesn't really work well for consultants!

samsharp99
Member

We recently (in the last six months or so) started using CodeSpaces as a replacement for TeamForge. Literally in the last few weeks we started using it in anger, as we needed something that could be easily accessed from abroad; a couple of our team are out of the country. Over that time we've put 5-10 applications in there, and about the same number of package sources.

Fortunately, because we'd only just started using it, we have local copies of everything that was in there, but we've lost the change history that went with it.

It's very unfortunate and sad that these companies are being extorted like this. I wonder if this is fallout from recent events like the Heartbleed bug.

AristosQueue (NI)
NI Employee (retired)

And I wonder how many of them get themselves into financial trouble and "oops, we got hacked and lost everything" is an easy out. It's hard to prove that they didn't shred the files themselves and bail.

Am I being cynical? Perhaps, except that we've seen it happen with these sorts of companies. And these would make great honey-traps for intelligence agencies, which could fold up shop once they've acquired what they needed.

My trust in the cloud is minimal. I don't trust Google or Amazon or Microsoft -- what I trust (somewhat) is the massive number of people and governments actively monitoring those companies and their activities to keep them honest, but they are still largely unregulated.

If there was ever a use case for a "public utility", in my opinion, the cloud is it. A highly regulated monopoly would be much more my preference for this sort of service than a bunch of rogue and competing private companies. Keeping your data secure is only as valuable as the money you are paying them, and when they hold the lifeblood of your company, it is worth *almost* the entire value of your company to keep that data going. That's the basis for an extortion system of incredible power.

Keep your own servers. Paying a sysadmin is worth the cost.

swatts
Active Participant

AristosQueue wrote:

And I wonder how many of them get themselves into financial trouble and "oops, we got hacked and lost everything" is an easy out. It's hard to prove that they didn't shred the files themselves and bail.

Exactly my conclusion, Mr Mercer, and really the reason I put it on the blog. Rather than a conspiracy theory, I think it is simple expedience. A small cloud-based hosting company could be liable for $$$$$$$$$$$$$$$$$$$$$$$$$$$$$$ in data. The moment anything bad happens it must be very tempting to close the door and run for the hills.

I'm going the route of a hosted server where I can back up my own data. It just means a whole load of Linux pain for the next few weeks.

Steve



justACS
Active Participant

AristosQueue wrote:

If there was ever a use case for a "public utility", in my opinion, the cloud is it. A highly regulated monopoly would be much more my preference for this sort of service than a bunch of rogue and competing private companies. Keeping your data secure is only as valuable as the money you are paying them, and when they hold the lifeblood of your company, it is worth *almost* the entire value of your company to keep that data going. That's the basis for an extortion system of incredible power.

So instead of rogue and competing companies, you would hand that power to a government? When ours has been so trustworthy these last few decades?

Keep your own servers. Paying a sysadmin is worth the cost.

This.

AristosQueue (NI)
NI Employee (retired)

niACS wrote:

So instead of rogue and competing companies, you would hand that power to a government?

Yes.

A) I didn't say anything I put up would be unencrypted. Hosting isn't the same as having.

B) A regulated monopoly is not the same as the government; it is possible to structure them as inherently antagonistic entities, and I see more benefit than pain in such arrangements.

It's a larger discussion than this forum. But if I were going to have a cloud, that's the model I would choose.

pavanb
Active Participant

Sorry to hear Steve.

Thank you for posting, and thanks to the community for the ideas and experiences. Stash got me to look up a tool for pair programming / real-time collaborative editing, albeit ASCII-based: http://blog.etherpad.org/2014/04/13/skype-and-google-hangouts-alternative/

M_Peeker
Member

We are using Plan.io for our project management, which has Redmine in the background taking care (?) of our repositories. We run Git and Mercurial, but SVN is supported as well. When I saw this post I looked around a bit and asked the people among us who are good at this. Apparently there is a way to download your own weekly backups of everything (not just repos, but the entire project from all points of view). I'm not telling you that we do this, but after Steve's experience I think we'd better start.

Can I download a backup of all my Planio data?

For your security, all data is already encrypted and backed up to an off-site location on a daily basis.

You can create your own backups weekly free of charge via Customer Account → Account Management → Request / Download Backups.

All backups include a full database-dump, SVN and Git dumps, and an archived ZIP file (attachments).



CLA
www.dvel.se
justin.goeres
Member

I'm late to the party on this due to some travel, but I have a sort of timely perspective: I just moved all my personal/private repositories from my own systems to the cloud, after hosting them on my own server for 10+ years.

Note: This is not what we do at JKI (AFAIK). It's what I do for my non-JKI projects, some of which are highly critical to me.

Here's what my system looks like:

  • Cloud hosting of all my repos. I use Kiln, but anything similar works.
  • Daily cron jobs on my local server, which pull all changes from the remotes. (As Fab mentioned earlier, this guarantees that I have a complete and up-to-date copy of all my repos inside my own walls.)
  • The local server's data is stored on a RAID array, so any single drive failure is non-fatal.
  • Nightly backups from my server to an attached USB drive (via a backup tool on Linux).
  • Monthly rotation of the USB drive into offsite/fireproof storage. (Not implemented yet; it's on the TODO list and I know how to do it... just haven't set it up.)
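The daily cron job in that list can be as small as one crontab line driving a fetch loop. A sketch under stated assumptions (the script path and schedule in the comment are illustrative; the demo creates one local mirror so the loop has something to refresh):

```shell
# Sketch of the nightly "pull all changes from the remotes" job.
# The crontab entry (3 a.m. daily) would look something like this
# (path is hypothetical):
#   0 3 * * *  /home/me/bin/pull-mirrors.sh >> /var/log/pull-mirrors.log 2>&1

# Demo setup: one mirror of a local stand-in repo under MIRROR_ROOT.
MIRROR_ROOT=$(mktemp -d)
src=$(mktemp -d)
git init --bare -q "$src/demo.git"                 # stands in for the remote
git clone --mirror -q "$src/demo.git" "$MIRROR_ROOT/demo.git"

# The body of pull-mirrors.sh: refresh every mirror under MIRROR_ROOT.
for repo in "$MIRROR_ROOT"/*.git; do
  git -C "$repo" remote update --prune             # fetch everything new
done
```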

I've done a lot of work developing and working with backup/data protection strategies. Would be glad to discuss over beers at NIWeek. Maybe a CLA Summit presentation topic?

AristosQueue (NI)
NI Employee (retired)

Justin: Do you encrypt what is on the cloud servers? I realize this is your personal work and not your work-work, but if it were for work, I suspect you have client files and/or trade-secret code that you wouldn't want shared, either inadvertently or deliberately, so I'm curious if you've looked into that aspect.

justin.goeres
Member

That's a really good question that I don't have an answer for.

The closest I can get to an answer is this:

  1. I use ssh exclusively to access the repos, so the data is secure in transit (https is fine too but I'm an ssh key guy).
  2. I do not specifically encrypt the data myself before transfer (other than using ssh, which isn't what you mean).
  3. I do not know whether Kiln encrypts the data on their servers. This page talks about Fog Creek's infrastructure but doesn't directly address the question of whether repos themselves are encrypted, and only claims that Fog Creek employees can't access it.
  4. Even if the answer to #3 is "Kiln encrypts the data," I am trusting Fog Creek to provide (and secure) the encryption keys.

As a point of comparison, I never encrypted the data when I hosted it myself, either. I always trusted the security of my systems to keep intruders out (I have a good NIWeek story about that). By moving my data to the cloud I've basically said, "I trust Fog Creek to secure their systems as well as I secure my own, and I also trust them not to look at my stuff."
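One hedged aside on that trust question: with git, you can sidestep server-side encryption entirely by encrypting a snapshot before it ever leaves your machine. A sketch using git bundle and OpenSSL (the inline passphrase is deliberately naive; a real setup would pull it from a key manager):

```shell
tmp=$(mktemp -d)
git init -q "$tmp/repo"
git -C "$tmp/repo" -c user.name=dev -c user.email=dev@example.com \
    commit --allow-empty -q -m "snapshot me"

# Pack the whole repository (all refs, full history) into a single file...
git -C "$tmp/repo" bundle create "$tmp/repo.bundle" HEAD --all

# ...and encrypt it before it ever touches a hosting provider.
openssl enc -aes-256-cbc -pbkdf2 -k "not-a-real-passphrase" \
    -in "$tmp/repo.bundle" -out "$tmp/repo.bundle.enc"

# Restoring: decrypt, then clone straight from the bundle.
openssl enc -d -aes-256-cbc -pbkdf2 -k "not-a-real-passphrase" \
    -in "$tmp/repo.bundle.enc" -out "$tmp/restored.bundle"
git clone -q "$tmp/restored.bundle" "$tmp/restored"
```

The host only ever stores an opaque encrypted blob, so the trust question reduces to key management on your own side.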

You raise interesting points, for sure.

AristosQueue (NI)
NI Employee (retired)

<snark>So, you give Fog Creek the same level of trust that you give your bank. That's cool.

Out of curiosity... do you know some of them personally? Or do you have faith in the government oversight of the industry? </snark>

justin.goeres wrote:

You raise interesting points, for sure.

I'm sorry if I make anyone else paranoid about the cloud, but I've come to the conclusion that the programming community as a whole -- far beyond just the LabVIEW user base -- is woefully insufficiently paranoid, and as a result we're creating products that are woefully insecure. I do not believe we should be building Top Secret-level security into every product we build, but we're a long way from even the basic "lock the door when you get out of the car" level. And if we start thinking about how our comrades are handling our data, maybe it'll make us all collectively better at handling our users' data.

justin.goeres
Member

AristosQueue wrote:

<snark>So, you give Fog Creek the same level of trust that you give your bank. That's cool.

Out of curiosity... do you know some of them personally? Or do you have faith in the government oversight of the industry? </snark>

Well, I don't give them access to my money. But other than that, your point stands. (And I do know some of them personally, but obviously not all of them.)

AristosQueue wrote:

I've come to the conclusion that the programming community as a whole -- far beyond just the LabVIEW user base -- is woefully insufficiently paranoid and as a result we're creating products that are woefully insecure.
I agree 100% with this.

But what's the alternative? I mean, I know I'm not qualified to admin a secure system, and my customers (again, personal projects; I'm not speaking for JKI) would be fools to trust me to protect their business-critical data.

In other words, my bet is that my known incompetence is a greater threat than the collective risk of Fog Creek's competence/honesty and ability to secure their systems.

But can I prove that? Nope. I just look around at the rest of the herd and figure that thousands of people trust Fog Creek (or GitHub or Beanstalk or CodeSpaces), so I should too.


AristosQueue (NI)
NI Employee (retired)

justin.goeres wrote:

I just look around at the rest of the herd and figure that thousands of people trust Fog Creek (or GitHub or Beanstalk or CodeSpaces), so I should too.

I have recently realized that many people don't realize they're trusting Fog Creek (etc.). "The site has a password, right? And you can't access the site without the password. Thus, my stuff is secure as long as no one has my password." Assume for a moment that the password really is secure... they do not have enough of a mental map of how the systems work to see that there's nothing about a password that inherently means "this must be an encrypting password". They just assume it is. The comments on various forums in the wake of Snowden have been very revealing about the real nature of the problem. Following the instinct of the crowd is a good idea when the crowd is at least asking the question... but I don't think the question is even being asked right now.

swatts
Active Participant

This is a fascinating discussion; when I get a second I'll peruse it again and have a think about best practices. I took a part of my job and outsourced it to "experts", and I don't know why I am continually surprised that the "experts" are as clueless as the rest of us.

I've loaded up my virtual Linux box with Subversion and FTP, written some fancy SSH LabVIEW, and woohoo! I have security where I need it (i.e. shifting the backups over FTP doesn't need to be that secure). It's obscure and unpublished and hasn't got a great big Extortion Target on it. Hopefully I can now build my bug-reporting stuff on top of the web server.

Linux is a frustrating, wonderful system!

Steve



Daklu
Active Participant

justin.goeres wrote:

I just look around at the rest of the herd and figure that thousands of people trust Fog Creek (or GitHub or BeanStalk or CodeSpaces), so I should too.


I realize this was written tongue-in-cheek, but I have an issue with the (imo) overuse of "should" in common language.

It's one thing to say, "Fog Creek has experts who know more about securing an IT infrastructure than I do and thousands of other people trust them; therefore I choose to trust them as well."

It's another thing entirely to say, "Fog Creek has experts who know more about securing an IT infrastructure than I do and thousands of other people trust them; therefore I should trust them as well."

"I should" indicates there is a much stronger justification for trusting Fog Creek than "I choose."  It implies anyone deciding not to trust Fog Creek is making an incorrect decision.  Even though Fog Creek does have experts with more knowledge than I, and thousands of others do trust them, it does not logically follow that these facts make them trustworthy.

(BTW, that documentary was totally faked.  Lemmings don't really do that.)

justin.goeres
Member

If you want to play the Let's Be Pedantic game, I wasn't writing "tongue-in-cheek." I was writing informally.

swatts
Active Participant

So here's an update: after a lot of 15-hour days I have got my VPS repository working pretty sweetly (it's astonishingly quick, as a bonus!). I have also centralised my project/document numbering system and am working on a Bugzilla-light system for bug reporting. It should all be done by the end of the week.

It all backs up at the touch of a button, too.

If anyone wants the details, or if you think it would make a good blog article or five, give me a prod.

Thanks for all your input as usual. Soon I can sleep again.

Steve



swatts
Active Participant

Bug reporting screen, work in progress

BugReportingScreen.png

Steve



justACS
Active Participant

I'd say this is worth some blog posts and possibly a CLA Summit presentation (which you'll have to give in Austin, of course, since I don't think I'll have the budget to come to Europe for a while).  If you're going to go to the trouble of dying for our sins, the least we can do is pay attention to what's killing you.

gobshite
Member

Remember Steve, it must pass requirement zero...

swatts
Active Participant

I feel a meme coming on...corporate wear, presentations, watch this space!

Steve

