|
but it runs so slow...
Know way too many languages... master of none!
|
Jason: I feel your pain. I've written lots of different things over the last 25 years and what used to take 500 bytes in assembly now takes 500k (not counting the 20MB framework you ALSO need). The assembly used to be elegant, tight, every byte on the stack accounted for and ran like lightning even on the most modest of machines. The new stuff? It's a crap shoot. You no longer REALLY know what's going on since you only wrote two lines of code and the framework magically does the rest behind the scenes.
But is that really so bad? What used to take days or weeks I can now do in hours -- And much of it is less error prone! The whole concept of super high level languages and wrappers around wrappers that wrap yet something else may not be efficient, but it no longer HAS to be in most cases. Hardware has gotten so cheap that, as much as I like optimizing code, it's often cheaper to just buy a bigger box. Don't get me wrong: There's a plethora of clueless application developers (versus programmers!) out there that take this concept to the extreme and just slap something together with total disregard for performance.
I agree with many of your points:
- RAD trades efficiency for rapid development cycles. Absolutely! But that's the point, isn't it?
- MS does release technologies at a pace that's impossible to keep up with. Their "let's see if this one sticks" approach drives me nuts! But stick to their core technologies and it's actually pretty solid. I've seen more backward compatibility out of MS than out of most companies.
As for the product at hand: Sharepoint...
We use it. It's a love/hate relationship. From a developer's perspective I think it's a pain in the neck. From an end-user standpoint: They "just get it". Show me anything in the same price range that has such a consistent UI (think users used to Windows/Word/Outlook/Etc.) and a similar feature set and I might be tempted to switch. I sure haven't seen it, and it's not because I haven't looked.
Peter
|
Peter: Well said. I agree with EVERY word you said, through and through.
On your point about Microsoft being pretty solid on backward compatibility: I agree, but as of late I keep finding that solidity going south. Sharepoint? Good product, but I'm ultimately switching my loyalties.
Note that a lot of my frustration comes from what I see customers having to go through more and more lately. I'll have you know I have really loved working with Microsoft technologies for a long time, but lately they have been giving me more reasons to dislike them.
BTW - I'm a fan of RAD dev, especially for GUI development and integration into existing foundations. I've written my own RAD systems that literally wrote tons of code. Shoot, I'm not about reinventing the wheel all the time; I'm a firm believer that a SOLID API lays the foundation so you can concentrate on the applications - the API is a CONSTANT, applications are not. When the API changes so frequently (the calls and such), it's hard to keep up. Honestly, if the "guts" perform decently, I don't care how they work. But I would be so much less frustrated if API calls stayed constant and were only added to. If this kind of API management were done properly, then applications could more easily be migrated, often without doing anything other than recompiling against the new API. If you told me you wrote an API that ran on ANY OS - and was always being added to - I'd love it. The problem is when calls you made before no longer work. For example, Microsoft did some good stuff in the Win32 API (and some bloopers, but mostly good): when they added a new feature, the new call was usually named with an "Ex" suffix, like CreateWindowEx for "extended", and more than once the code behind the API functions was changed but the results were the same, so you only needed a simple recompile.
If Microsoft had one BASE API that was endian independent and platform independent, performed decently, and compiled to binary on any platform (at least all their own OSes), I'd be pretty interested. If they didn't keep dropping support for old calls during its life, and instead just changed the code underneath as improvements were added, adding new function calls as the "recommended" versions for new software being written, I'd be very, very impressed. Then we could get really skilled at that foundation, write REALLY awesome RAD software that generates code for that platform so we get the job done faster, and customers could keep building on their infrastructures rather than replacing "tried and true" tested systems with new stuff that hasn't been proven.
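The additive "Ex" convention described above can be sketched in a few lines of C. The names here (`widget_create`, `widget_create_ex`) are hypothetical, purely to illustrate the idea: the old entry point keeps its exact signature forever and simply forwards to the extended call with defaults.

```c
#include <stdio.h>

/* Extended call, added later: new capability arrives as a NEW
   function instead of a breaking change to the old one. */
int widget_create_ex(const char *name, unsigned flags) {
    printf("creating %s with flags 0x%x\n", name, flags);
    return 0; /* 0 = success, mirroring common C API conventions */
}

/* Original call: shipped years ago and never changes. Internally it
   forwards to the extended version with default flags, so old callers
   need at most a recompile. */
int widget_create(const char *name) {
    return widget_create_ex(name, 0);
}
```

This is exactly the recompile-only migration path described above: the implementation behind both calls can be rewritten freely, as long as the signatures and results stay put.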
Know way too many languages... master of none!
|
JasonPSage wrote: If Microsoft had one BASE API that was endian independent and platform independent, performed decently, and compiled to binary on any platform
LOL! I think they did try that. Remember NT on Alpha? That worked well! At this point, the PC based platform is so ubiquitous that I don't think it's an issue anymore. You really want to run Windows on a Sun box?
Your comment on the Win32 API is spot on: It used to be rock solid. What worked on Win16 even worked on Win32 and it all remained compatible. Although they goofed up once in a while, stuff usually got added instead of changed. But the other side of that coin is that you have to have a huge amount of code in place to support those old functions in light of new hardware and new functionality. Net result: Things get slow and bloated. Isn't that where we started?
Also, some of the old functions just CANNOT be made secure. Look at something as simple as strcpy... How many functions do you know that just take a pointer and start writing there without a decent size check? And you know that stuff exists on any platform, not just Windows. With more and more developers that don't know their mallocs from their frees, it's a disaster waiting to happen.
At some point, you have to address this stuff. You have to get rid of some of your original (good or bad!) design decisions. You have to remove some of the bloat. And the market says you have to make it easier for less experienced programmers to write decent code in a reasonable timeframe. This gives you no choice but to drop some compatibility. This applies to the base API, but also to everything else.
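As a minimal sketch of the strcpy point: the classic call has no idea how big the destination is. A bounded wrapper keeps the lean copy but refuses writes that would overflow (`copy_bounded` is a hypothetical helper for illustration, not a standard or Microsoft function).

```c
#include <string.h>

/* Hypothetical bounded wrapper around strcpy: check the destination
   size first, then do the plain fast copy only when it is known to fit.
   Returns 0 on success, -1 if the copy would overflow dst. */
int copy_bounded(char *dst, size_t dstsize, const char *src) {
    if (dstsize == 0 || strlen(src) >= dstsize)
        return -1;        /* would overflow the buffer: refuse */
    strcpy(dst, src);     /* safe now: length checked above */
    return 0;
}
```

Both views in the thread fit this shape: the raw call stays available for code that has already proven its lengths, while the checked variant becomes the recommended default.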
They took this to the extreme when they released Vista and all hell broke loose. Now people (developers) are getting used to it and it's getting better. Windows 7 is the optimization of what they built in Vista (doing exactly what you said: Change the code behind the APIs, but not the APIs themselves) and in Windows 8 we'll all get it right.
The .NET stuff addresses the remaining issues, and they've actually done a pretty decent job. It's gone through some rapid changes in the first few versions, but I'd rather they do that early on.
And if you really want to run your "old" stuff, you know what? Run it on an old environment. We'll even give you a free virtual one. That's not entirely bad, is it?
BTW: Change happens. The wipers from my old Dodge don't fit my new Bimmer either. And I thought the ones from the Dodge were fine; they removed the rain, didn't they?
Peter
|
I agree with you Peter. I'm not against new stuff - I'm for it.
I do disagree on the strcpy stuff though, because frankly that code isn't bloated - it's lean and mean, and in the right hands it's plenty secure. Microsoft isn't the only one who knows a malloc from a free.
I do like and embrace the idea of having options: strcpy "raw" as well as having newer more secure options with documentation that specifies which is recommended but allows developers to use their own judgement.
I'm good with change too. BTW - NT on Alpha was a decent approach, and I was a true NT fanboy for years. Server 2008 is pretty decent also.
--Jason
Know way too many languages... master of none!
|
Well said, friend. You're quite right about MS releasing technologies so fast. We just started using VS 2008 in the past 6 months, and now VS 2010 is coming.
|
Which is fine - 2008 and now 2010... The historical fact is, code you got working in 2008 might not recompile or work in 2010. Which makes it so you aren't building on what you learned; you're too often scrapping what you had figured out.
I like to figure out new stuff... Say I write a cool app that does XYZ... I should be free now to write a new application that does something different, without being forced to constantly revisit, rework, and retest the XYZ thing every "iteration" or "product release" because someone else dictates as much.
I was an old Turbo Pascal developer years ago, when Borland had a lot of awesome engineers writing their stuff. I left all that for Microsoft technology. Now there is a language called Free Pascal, which is Pascal but completely object oriented like Delphi; it compiles on iPhone, Mac, Linux, Unix, DEC Alpha, Nintendo, and various embedded systems, and the APIs have pretty much stayed backwards compatible. There is even a complete GUI IDE environment, where you write a GUI once and it works on all the above OSes if they have a GUI system... even FreeBSD and OS X, for the people who really don't like change, LOL. It even has a Delphi-compatible mode, where it's like 99% dead-on compatible with Delphi, making Delphi programs capable of compiling on other hardware and operating systems (when the code isn't too OS-specific, e.g. tons of Win32 calls).
I wrote a web server with it that blows the doors off of IIS, Apache, and lighttpd - and unlike Java's p-code (Java is not really compiled to native binaries), Free Pascal compiles to binary on every platform it supports. It's as fast as, and in many cases faster than, Microsoft's C++ stuff, and runs circles around .NET for speed.
I've used that language for years and it's funny - people still think Pascal is a learner's language, and they don't seem to know it's object oriented. I've been heckled over my admiration for it often, but I laugh when my code runs faster and on more platforms than any other system I know (that compiles to binary). Java does well for portable scripting, as do Python and PHP, and I think Ruby is honorably in the scripted/portable solution ranks now.
Ah well - the fact is, I'm an enthusiast and I love coding. I just like to build on what I've done and not have to start over every year or two - especially because most decent software projects I write take a year or more to complete (the think-stuff anyways).
--Jason
Know way too many languages... master of none!
|
JasonPSage wrote: Code I wrote 10 years ago on Unix still runs on Unix and Linux today. Code I wrote two years ago is already "outdated" in Microsoft Circles...
It's only a recent issue IMO, but yeah I totally agree. They're overdoing it. I don't see a real huge boost to productivity overall either compared to what I saw in the VB6 days.
Oh be careful with saying anything wrong about MS here. The kiddies are lurking.
|
Well, I agree with you, but frankly I wouldn't like my code to still be working after 10 or 15 years.
If our code works for that much time, we will soon have less work.
Think about it: if your car works very well after 20 years, you may not buy a new one.
Every product should have its life span. Just my opinion.
|
The Microsoft ODBC technologies that have worked for years don't even work anymore on the new systems...
Really? I think you wrote this without thinking. It works for me.
Nuclear launch detected
|
So, help me on this one.
I have a legacy VB6 application that does all kinds of awesome things on ODBC databases. I've heard about, read about, sought, and searched for MDAC 2.8, as it's supposed to work on Vista. Have you had luck with this?
Furthermore, on the 64-bit rollout of Vista, how do you proceed?
I'm not trying to be a smart aleck here - I haven't had any luck and became frustrated.
There are literally tens of thousands of lines of tried and true code that allow me to copy databases from one to another, like SharePoint but easier. I can use it to monitor multiple databases remotely, launch programs and scripts when systems go down or connectivity is lost, and likewise when they return. I can monitor 250 databases with it... I can literally connect to Oracle, copy the entire data model to multiple target databases on different platforms, then export data models to Excel, MySQL, Postgres, and import and export to text: you name it. It's a pretty awesome utility... I'd hate to chuck it, and rewriting it isn't really an option. It took on-and-off development over five years to get it where it is: it's quite mature.
Know way too many languages... master of none!
|
Yes, 32-bit ODBC works normally on Windows Vista x64.
x64 applications work as well, though I've only tried with SQL Server.
But everything works as normal.
I'm talking about C++ and the ODBC API here. Not sure how VB6 is affected by MDAC.
I'm sure there is an MDAC 2.8 for 64-bit; I don't know if there is ODBC x64 support.
You can run the application in 32-bit mode, even if the OS is x64.
On Vista, I bet there's something related to UAC. You may try to read/write ODBC config files and get all kinds of access-denied errors; I don't know.
But this can be easily overcome by writing Vista-specific code, or devising a CoCreateInstanceAsAdmin code sequence when UAC elevation is needed. My $0.02.
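For what it's worth, the C-level route here usually starts with a connection string handed to the ODBC API's SQLDriverConnect. The helper below is a hypothetical sketch that only builds such a string (it never touches a driver, so it runs anywhere); note that on 64-bit Windows, a 32-bit app such as a VB6 executable sees the DSNs managed by the 32-bit administrator, %SystemRoot%\SysWOW64\odbcad32.exe, not the 64-bit one in System32.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical helper: format a driver connection string of the kind
   passed to SQLDriverConnect. Returns 0 on success, -1 if the output
   buffer is too small for the full string. */
int build_conn_str(char *out, size_t outsize,
                   const char *driver, const char *server, const char *db) {
    int n = snprintf(out, outsize,
                     "Driver={%s};Server=%s;Database=%s;Trusted_Connection=yes;",
                     driver, server, db);
    return (n > 0 && (size_t)n < outsize) ? 0 : -1;
}
```

The bounds check matters: snprintf reports the length it *wanted* to write, so comparing against the buffer size catches truncation instead of silently passing a mangled string to the driver manager.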
Nuclear launch detected
|
Thanx for the response. I haven't had much luck here - but I could try again.
Know way too many languages... master of none!
|
Every time I INVEST time and energy into a Microsoft product, they discontinue it just as I'm mastering it.
I'm mastering the Windows API (I've tried to do so every day since 1996). And that's not outdated.
Nuclear launch detected
|
xacc.ide - IronScheme 1.0 beta 3 - out now! ((lambda (x) `((lambda (x) ,x) ',x)) '`((lambda (x) ,x) ',x))
|
I don't understand why you have received several 1-votes for saying that you don't plan to use it?! What are they trying to tell you? Like... you *must* use it? So, let me repeat: "I am never planning to use SharePoint".
You cannot just convey your opinion without these idiots graying your post? Some people are just completely screwed up.
It is a crappy thing, but it's life -^ Carlo Pallini
|
Rajesh R Subramanian wrote: You cannot just convey your opinion without these idiots graying your post?
I haven't voted, but IMHO he did not express any opinions in his post - he merely made his vote public.
|
Nemanja Trifunovic wrote: I haven't voted, but IMHO he did not express any opinions in his post - he merely made his vote public.
And how does that justify the ACs acting like children?
|
I like the concept of it, and seeing as it is a 'product', it is not bad at all. However:
- in order to use it well, you need customizations and a very good setup. (it's still missing features.)
- you need to have designated people maintaining it.
- it's not always that fast.
|
I agree with your comment about Sharepoint.
Know way too many languages... master of none!
|
V. wrote: Stop smoking so you can: Enjoy longer the money you save.
How do you enjoy money that you save? If you enjoy your money, then you need to be spending it. Unless you save your money in a hole in your mattress - then you might enjoy it in another way.
Vita est usquequaque virtus victus ut plenus. Ego non sum semper iustus tamen Ego sum nunquam nefas!
|
We built a significant SharePoint (WSS 2.0) Portal Server for a customer project about 4 years ago. The customer never bought, so we started using it internally.
While all of the criticisms of SharePoint I've read seem to have elements of accuracy, in our organization, several hundred engineers have come to rely on it for daily functions.
With very little customization we're using it effectively to support Scrum for software development, ITIL Incident and Problem management and general collaboration.
In our Scrum use we've created three custom templates (one for the Product Team Site, one for the Sprint, and one for Customer meetings).
Does it do everything we want? No. But applying a Lean Six Sigma Pareto analysis to common collaboration problems, we solved a good 80% of our needs in about 2 hours of template work just using the GUI.
We have done some work on custom web parts, and I have found it to be a functional object model - but then, my idea of a "bad programming environment" is Clipper, on a Z-184, in a tent in a Shamal in Saudi Arabia.
All in all, we find it useful, and will continue to extend and expand our use.
|
I'd be very interested in finding out how you use SharePoint for Scrum.
Is it a certain template? Where can you find it? Did you do custom dev on it in order to get it to work?
Do you need a certain version of sharepoint?
Any information on this would be great.
Many thanks.
|
We used SharePoint 2003 because that was the install available.
We made three custom templates using the SharePoint GUI, no custom programming.
We created a Product Site template based on a Team Site. This contains persistent Product artifacts, the Product Backlog, the Failure Mode Effects Analysis (FMEA, it's an LSS tool) and some other things that are Product centric.
We created a Sprint Template based on a Meeting Workspace. This is where we put the Sprint Backlog and the artifacts the Team uses for Scrum working inside a Sprint. We have lists for Since Last, Before Next, Obstacles and Burn Down that get updated at each Scrum (we have distributed Scrum teams, so this is really, really helpful). Other lists include Sprint Tasks (things that need to happen to deliver the Sprint Features), Sprint Events (planning meetings for key dates), Sprint Documents (non-code things, all code goes in TFS), and Lessons Learned (we're pretty serious about continuous learning, so each Sprint has a Lessons Learned activity that is deliberate and recorded).
We created a "Gap Week" site which we use between Sprints as Product Training \ Sprint Planning support. We aren't able to devote a team to a product for more than one or two Sprints, so we've adapted the Gap Week to transition new Team members based on the expected skill sets we need, availability and the relative priority of any Product compared to all our other Products. The Gap Week allows us to train the new Team Members, review the Product Backlog, and catch up on required training or do specific technical training for the upcoming sprint. (The Gap Week started out at 5 work days, but we're looking at cutting it down to 3 in order to go faster. We think we can do this because we've done a lot of Product Domain training and now most people are familiar with most of the Products in our space.)
We created a custom webpart (programming task here) that rolls up lists from sites for consolidated views. The one we have is pretty basic, and we're probably going to Sprint our custom webparts Product later this year to make some updates.
|