.NET Framework ??

Try moving away from Microsoft and using some real software, written for a
purpose other than taking your money.

 
Joel Kolstad wrote:
"Mike Monett" <No@email.adr> wrote in message
news:Xns9905BDEFF407FNoemailadr@208.49.80.251...
Whenever I'm looking for applications, I disregard any application written
in .NET, and continue looking for code written by professionals.

I'm willing to bet you a dollar that -- at least if you're running Windows XP
or Vista -- you're using plenty of .Net programs without even knowing it.

You can argue that .Net -- and similar technologies such as Java or (to a much
lesser extent) Python -- is not worth its (sometimes quite significant)
overhead, but there are some objective advantages to what .Net is attempting to
do. Not that that implies Microsoft has necessarily done a particularly good
job (I wouldn't really know, having only ever written "toy" programs in .Net),
but hey -- at least they're trying to advance technology while they take over
the universe! :)

One of the authors in the LTspice forum generated a MOSFET model program
using .NET. He recently changed it to a stand-alone exe. This shows .NET is
not needed, and how easy it is to get rid of it.

Note that producing a stand-alone .exe doesn't imply that .Net is gone -- it
could have just been bundled up in the executable.

.Net certainly isn't "needed," but neither is Windows Vista or XP, or
Microsoft Outlook or Word or any other program out there. How easy or hard it
is to get rid of .Net is largely a function of the size, complexity, and scope
of the program that's written -- "hello world" is trivially ported to any
language/framework you want, after all.
It's better said that "hello world" is more easily ported to .NET, and as
the programs get larger and demand more services (think databases,
specialized networking, etc.) the odds increase that .NET will _not_
have support for it.

Any 'write once, run anywhere' apps have to target the lowest common
denominator API set. With something like Java, which has a JRE for a
wide range of platforms, it was worthwhile for developers to add the
hooks for underlying services. For .NET, developers just asked "What's
the point?" .NET originally ws planned to support only Windows (ignoring
the Mono project). Its a 'write once, run in one place' runtime. So all
the developers asked themselves, "If I've already got my stuff running
on Windows (native .EXE), what does all that additional pain and
suffering buy me?".


--
Paul Hovnanian mailto:paul@Hovnanian.com
------------------------------------------------------------------
The large print giveth and the small print taketh away.
-- Tom Waits
 
On Mon, 02 Apr 2007 02:55:56 GMT, "Tom Del Rosso"
<td_01@att.net.invalid> wrote:

"Mike Monett" <No@email.adr> wrote in message
news:Xns9905BDEFF407FNoemailadr@208.49.80.251

Anyone who writes software in .NET is demonstrating their amateur
status and corresponding incompetence. I don't need to waste my time
with code written by amateurs, and .NET clearly shows who they are.

In fairness to the programmers, it's probably their managers telling them to
use it.
I use .net; in fact I pushed it at my last job. I told the managers to
use it and I am a C++ programmer, go figure. I guess I am an amateur.

What do you do for a living?
 
On Mon, 02 Apr 2007 03:08:47 GMT, "Homer J Simpson"
<nobody@nowhere.com> wrote:

"Jamie" <jamie_ka1lpa_not_valid_after_ka1lpa_@charter.net> wrote in message
news:g2ZPh.116$xB7.92@newsfe12.lga...

I use C++ and Delphi mostly myself (Win32 on both).

Hasn't Borland jumped aboard also?
Borland lost it years ago; it's all reactive, a bit like Linux these
days actually.
 
"The Real Andy" <therealandy@nospam.com> wrote in message
news:s86413divaandtrqh9f87kedb8f76he55c@4ax.com
On Mon, 02 Apr 2007 02:55:56 GMT, "Tom Del Rosso"
<td_01@att.net.invalid> wrote:

"Mike Monett" <No@email.adr> wrote in message
news:Xns9905BDEFF407FNoemailadr@208.49.80.251

Anyone who writes software in .NET is demonstrating their amateur
status and corresponding incompetence. I don't need to waste my time
with code written by amateurs, and .NET clearly shows who they are.

In fairness to the programmers, it's probably their managers telling
them to use it.

I use .net; in fact I pushed it at my last job. I told the managers to
use it and I am a C++ programmer, go figure. I guess I am an amateur.
Then go argue with the people making comments about programmers and have
your manager meet me in the alley.


What do you do for a living?
--

Reply in group, but if emailing add another
zero, and remove the last word.
 
"Paul Hovnanian P.E." <paul@hovnanian.com> wrote in message
news:4611A189.72384B6@hovnanian.com...
Any 'write once, run anywhere' apps have to target the lowest common
denominator API set. With something like Java, which has a JRE for a
wide range of platforms, it was worthwhile for developers to add the
hooks for underlying services.
The "hooks" for .Net come in the form of COM "objects." Pretty much every
major Windows application out there has a COM interface available (including
many of the fancier schematic capture/PCB layout tools, and even high-end RF
design packages such as Microwave Office). So there's really a philosophical
difference in design there: Java tries to have a higher-level "base" API --
but you're left with a myriad of different protocols for interfacing to
anything not included in that API -- whereas .Net tries to have a purposely
lower-level API, and Microsoft dictates that any extensions should come in the
form of COM objects. :) (This is perhaps the main reason Microsoft Office
remains notably more powerful than OpenOffice, even though the VAST majority
of MSO users will probably *never* use those features.)
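As a concrete illustration of the COM-as-extension-hook idea (a minimal sketch
only -- it assumes Excel is installed and uses its standard "Excel.Application"
ProgID, nothing specific to the EDA packages mentioned above), a .Net program
can late-bind to any automation server like this:

using System;
using System.Reflection;

class ComHookDemo
{
    static void Main()
    {
        // Look up the COM class by its registered ProgID.
        Type excelType = Type.GetTypeFromProgID("Excel.Application");
        if (excelType == null)
        {
            Console.WriteLine("Automation server not registered on this machine.");
            return;
        }
        object excel = Activator.CreateInstance(excelType);

        // Late-bound property set and method call, dispatched through COM.
        excelType.InvokeMember("Visible", BindingFlags.SetProperty, null,
                               excel, new object[] { true });
        excelType.InvokeMember("Quit", BindingFlags.InvokeMethod, null,
                               excel, new object[0]);
    }
}

The same pattern works against any application that exposes a COM automation
interface, which is the "extensions come as COM objects" point being made here.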

I do agree that how well these design philosophies work is largely a
function of program size and complexity, though.

So all
the developers asked themselves, "If I've already got my stuff running
on Windows (native .EXE), what does all that additional pain and
suffering buy me?".
The unfortunate answer to that is, "it meets some pointy-haired boss's
checklist for acceptable software purchases." :-( I have a friend who's a
programmer and he says they see this all the time -- they have a very nice
software package (it's very much COM-enabled -- .Net apps can interface with
it just as readily as Excel or Visual BASIC or Java can!) written in C++, and
yet they'll see some company's checklist for software purchases that requires
the software be written in some .Net language. Apparently PHBs attend some
Microsoft .Net infomercial seminar and buy into the hype that anything written
in .Net is good, anything else is junk... when in actuality, of course,
there's plenty of junk to be found regardless of the language chosen.

Said friend refers to "dot net" as "dot crap." :)

---Joel
 
On Tue, 03 Apr 2007 19:10:19 +1000, The Real Andy wrote:
On Mon, 02 Apr 2007 02:55:56 GMT, "Tom Del Rosso"
"Mike Monett" <No@email.adr> wrote in message

Anyone who writes software in .NET is demonstrating their amateur
status and corresponding incompetence. I don't need to waste my time
with code written by amateurs, and .NET clearly shows who they are.

In fairness to the programmers, it's probably their managers telling them to
use it.

I use .net; in fact I pushed it at my last job. I told the managers to
use it and I am a C++ programmer, go figure. I guess I am an amateur.

What do you do for a living?
Solve Real-World Problems. :)

Thanks!
Rich
 
"Mike Monett" <No@email.adr> wrote in message
news:Xns9906B4AB439DNoemailadr@208.49.80.251...
"Joel Kolstad" <JKolstad71HatesSpam@yahoo.com> wrote:

"Mike Monett" <No@email.adr> wrote in message
news:Xns9905BDEFF407FNoemailadr@208.49.80.251.

Whenever I'm looking for applications, I disregard any
application written in .NET, and continue looking for code
written by professionals.

I'm willing to bet you a dollar that - at least if you're running
Windows XP or Vista - you're using plenty of .Net programs without
even knowing it.

Nope. Win98SE. No need for XP with all the problems. Biggest
advantage is I can use XCOPY32 and copy every file to a backup disk.
Takes only about 60 seconds, so I do it often.

Also, System File Checker is much better in Win98. I can verify
every critical file on the hard disk and ensure nobody downgraded a
dll or changed anything in the kernel.

You can argue that .Net - and similar technologies
such as Java or (to a much lesser extent) Python - is not worth
its (sometimes quite significant) overhead, but there are some
objective advantages to what .Net is attempting to do. Not that
that implies Microsoft has necessarily done a particularly good
job (I wouldn't really know, having only ever written "toy"
programs in .Net), but hey - at least they're trying to advance
technology while they take over the universe! :)

One of the authors in the LTspice forum generated a MOSFET model
program using .NET. He recently changed it to a stand-alone exe.
This shows .NET is not needed, and how easy it is to get rid of
it.

Note that producing a stand-alone .exe doesn't imply that .Net is
gone - it could have just been bundled up in the executable.

Not likely. The exe is about the same size as before. And it loads
much faster.

[...]

----Joel

MS .NET is junk. The concept sucks, and the execution is typical MS
crap.

I guess if you hire all these brilliant software jocks, you gotta
let them earn their keep. If only they'd keep it amongst
themselves instead of forcing the rest of the planet to put up with
it.

Regards,

Mike Monett
I hadn't come across the '.NET' rubbish until now.
Bought a USB hard disc this afternoon. It had on its CD a "One click backup!!"
program.
240 Mbytes of .NET program crap later, I find that this "program" can
transfer no more than a single file at a time! Hell, I can do a whole
directory via normal drag-n-drop. Deleted the whole mess.
Yes. Rank amateurs all the way down the line.




--
Posted via a free Usenet account from http://www.teranews.com
 
John,

One of my favorite programs for "backup" is Beyond Compare, by Scooter
Software. Even though it's really meant more for "synchronizing" than "backing
up," it's powerful enough to do the latter much better than many of the junkware
packages that come with, e.g., new hard drives.

I'm quite confident it's not written in .Net. :)

---Joel
 
On Sat, 17 Feb 2007 20:15:43 -0500, Jamie
<jamie_ka1lpa_not_valid_after_ka1lpa_@charter.net> wrote:

The Real Andy wrote:
On Fri, 16 Feb 2007 20:32:09 GMT, "Genome" <mrspamizgood@yahoo.co.uk>
wrote:


"Jim Thompson" <To-Email-Use-The-Envelope-Icon@My-Web-Site.com> wrote in
message news:k61ct2t60icdg2geptsgpg9uevea065jn8@4ax.com...

I downloaded a calendar program that also installed ".NET Framework"

I uninstalled the calendar... pure crap.

Does ".NET Framework" have any usefulness, or should I uninstall it as
well?

...Jim Thompson

No..... your computer worked before it was installed; your computer will work
after you get rid of it.


Yes, but more and more people are using .net. Once installed, that's
it.


I know three fifths of bugger all about this stuff but .NET is some new
MicroCrap crap which is another layer of shit MicroCrap are layering on top
of their other crap to 'consolidate' programming under Windows.


It's the same as Java.

Sorry, it's not.
But that doesn't mean it's any good either.
The code behind may be different, but the model is essentially the
same. Both are managed languages, both compile to an intermediate
language, and both are JIT compiled at runtime.

The fact that this calendar program had to install bits or all of the .NET
framework just so it could tell you what day of the week it is and was shit
at doing that just tells you it is a shit piece of software...... probably
written in Visual NET or some other MicroCrap Crap....


Actually, it's probably more robust and secure than any Borland app.

Bad comment to make, Borland apps have nothing to do with security
issues, it's the people writing the code and that's where the problem
is. .NET robust? I don't think so. Crippled and slow? Yes.
.net is robust in the same sense as java. Sure you can write unsafe
code (in fact MS even use the term unsafe in the languages) but then
you are being plain mad.
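(For reference, this is the sort of thing the "unsafe" keyword refers to -- a
minimal C# sketch, compiled with the /unsafe switch, shown purely to illustrate
that the escape hatch exists:

class UnsafeDemo
{
    static unsafe void Main()
    {
        int[] data = new int[4];
        fixed (int* p = data)            // pin the array so the GC can't move it
        {
            for (int i = 0; i < 4; i++)
                p[i] = i * i;            // raw pointer writes, no bounds checking
        }
        System.Console.WriteLine(data[3]);   // prints 9
    }
}

Outside such a block the runtime keeps its array bounds and type checks.)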

Crippled and slow? You quite clearly have never used it in a commercial
environment. I do, in an enterprise environment, and I can assure you
it is not crippled and slow.

Microsoft loves people like you, easy brain washing.
Microsoft love people like me because we spend money on their
products. I don't buy the MS marketing, I buy the products because I
can develop in a shorter timeframe and my code is a lot more robust.
In a commercial environment that means a lot, the business gets what
they want on time, I get paid and they then give me more work because
I can deliver. It's pretty simple.

You can now get Borland Turbo Delphi for free and write applications for
commercial use. That uses the .NET framework BUT produces stand alone
executables that do not need to install .NET crap on your computer to work.


Borland Delphi just wraps up MFC anyway, so it is just a
layer of crap sitting on top of Windows. Might as well just do it in C#.
You can get Visual Studio Express for free now too, and you are
even allowed to write commercial code with it. If you don't want .net,
then use the free Visual Studio Express C++ edition and write an
unmanaged C++ MFC application.
Sorry, you don't know what you're talking about. Now you may be
talking about the VCL. That is Borland's class library and it has no
MFC whatsoever.
Sorry, I meant Win32. However in saying that, when Borland dropped OWL
they began paying MS a licence fee to use MFC. I have not touched
Borland for some time now, so I don't really know what the deal is
these days.

I did just knock back a job that involved Borland C++ however. Also
knocked back a Delphi job too, both in preference for a job doing
enterprise .net systems.

Fortunately Homer has me on plonk so he won't be able to comment on the crap
I just wrote.


No great loss there.


DNA

I am not associated with the IAENG. ;-)

Before going off on a tangent, I suggest you get the
facts straight.
You must be a Borland user.
 
Chuck Harris wrote:

The Real Andy wrote:

.net is robust in the same sense as java. Sure you can write unsafe
code (in fact MS even use the term unsafe in the languages) but then
you are being plain mad.


Java runs in a safe and secure "sandbox"; it cannot gain access to anything
you don't give it access to. It is safe by design... that said, I don't
write Java, or C#... I prefer the embedded world.


Crippled and slow? You quite clearly have never used it in a commercial
environment. I do, in an enterprise environment, and I can assure you
it is not crippled and slow.

Microsoft loves people like you, easy brain washing.


Microsoft love people like me because we spend money on their
products. I don't buy the MS marketing, I buy the products because I
can develop in a shorter timeframe and my code is a lot more robust.
In a commercial environment that means a lot, the business gets what
they want on time, I get paid and they then give me more work because
I can deliver. It's pretty simple.


I never found that to be true. I have found that Microsoft's applications
tend to confound my efforts to write good software. I would much rather
work
with opensource.

The problem as I see it, is MiracleSlop has such a monopoly on seats in
the computer world that finding opensource work is difficult. It is worth
the effort, though, as it is so much easier and more reliable.

-Chuck
Well, finally someone who knows something!


--
"I'm never wrong, once i thought i was, but was mistaken"
Real Programmers Do things like this.
http://webpages.charter.net/jamie_5
 
On Sun, 18 Feb 2007 11:07:28 -0500, Jamie
<jamie_ka1lpa_not_valid_after_ka1lpa_@charter.net> wrote:

The Real Andy wrote:

On Sat, 17 Feb 2007 20:15:43 -0500, Jamie
<jamie_ka1lpa_not_valid_after_ka1lpa_@charter.net> wrote:


It's the same as Java.

Sorry, it's not.
But that doesn't mean it's any good either.


The code behind may be different, but the model is essentially the
same. Both are managed languages, both compile to an intermediate
language, and both are JIT compiled at runtime.
Crap, crap and double crap.
.NET is not the same in the background.


.net is robust in the same sense as java. Sure you can write unsafe
code (in fact MS even use the term unsafe in the languages) but then
you are being plain mad.

Crippled and slow? You quite clearly have never used it in a commercial
environment. I do, in an enterprise environment, and I can assure you
it is not crippled and slow.
Typical young coder.. I've been around the barn far longer than
you think.
Ha ha ha, I take that as a compliment.

I feel sorry for people like you getting led down that dark
path.
The path to financial success. It's funny how when people lose an
argument they resort to insulting people.


Microsoft love people like me because we spend money on their
products. I don't buy the MS marketing, I buy the products because I
can develop in a shorter timeframe and my code is a lot more robust.
In a commercial environment that means a lot, the business gets what
they want on time, I get paid and they then give me more work because
I can deliver. It's pretty simple.

Yeah, I know, it's pretty simple. MS loves simple users.
Glad you finally admitted it.


Borland Delphi just wraps up MFC anyway, so it is just a
layer of crap sitting on top of Windows. Might as well just do it in C#.
You can get Visual Studio Express for free now too, and you are
even allowed to write commercial code with it. If you don't want .net,
then use the free Visual Studio Express C++ edition and write an
unmanaged C++ MFC application.

Sorry, you don't know what you're talking about. Now you may be
talking about the VCL. That is Borland's class library and it has no
MFC whatsoever.


Sorry, I meant Win32. However in saying that, when Borland dropped OWL
they began paying MS a licence fee to use MFC. I have not touched
Borland for some time now, so I don't really know what the deal is
these days.
Borland has never implemented any of MFC in any of the products shipped to the
end user. I don't know where you got this information from but it's
clearly incorrect. Borland has their own set of class libraries that
simply sprang from the early days of OWL. They call it the VCL now.
VCL = Visual Component Library.
Ahh, remember when Borland dropped OWL, they licenced MFC? Remember
that there was some condition regarding the approval that prevented
Borland from continuing with MFC? It was about this time that everyone
stopped using Borland products. VCL is completely different.



I did just knock back a job that involved Borland C++ however. Also
knocked back a Delphi job too, both in preference for a job doing
enterprise .net systems.

Don't feel bad, you did them a favor by not letting them hire you.
It's funny how when people lose an argument they resort to insulting
people.


You must be a Borland user.
Yes I am; I'm also an MS VC++, VS user. I
write code for Windows Mobile, etc.
I've been in the field since the days of
punch card computers and have followed the
product lines of Borland, MS, Symantec, Watcom,
to name a few.
Excellent, you must be very successful in your job.

Now that we're done with our pissing contest, go back
to your slow, MS-controlled .NET applications, and have fun
writing slow bloated code.
I still can't see where you get slow and bloated from. You really
should spend some time using .net; you will be surprised.

Just think, since you're such a good MS customer, maybe they
won't make you wait for months to fix a crash/serious bug that
arises in your code due to a framework error that only MS can
fix.
Hasn't happened to me yet, and I have been using .net for 4 years now.
 
The Real Andy wrote:
Jamie wrote:
[...]
Borland has never implemented any of MFC in any of the products shipped to
the end user. I don't know where you got this information from but
it's clearly incorrect. Borland has their own set of class libraries
that simply sprang from the early days of OWL. They call it the VCL
now. VCL = Visual Component Library.

Ahh, remember when Borland dropped OWL, they licenced MFC? Remember
that there was some condition regarding the approval that prevented
Borland from continuing with MFC? It was about this time that everyone
stopped using Borland products. VCL is completely different.
IIRC, Borland licenced MFC around the time of C++ 5. It was only included
for compatibility reasons so that you could compile your MFC apps with BC++,
and it didn't always work 100% (due to the dependence of MFC on MSVC
undocumented extensions). OWL was still very much the focus of BC++ 5.

BCB was the next product in line, and this used VCL instead of OWL. OWL
wasn't dropped until BCB5, though it was (like MFC) included only for
compiling older projects with the newer compiler. MFC compatibility is
included all the way through the BCB line AFAIK, though it's for compiling
only. You can't use the designer to make MFC GUIs.

There was a big debate a couple of years ago in one of the Borland NGs
about when/why/how Borland "lost" the C++ market. Their marketshare had
dropped substantially long before BCB was released, or OWL was ditched for
that matter. The basic problem was versions 4 and 5 of BC++ were pretty bad
(especially the IDE), and OWL was simply not being updated and fixed very
quickly. While they picked up some new people with the VCL (though more with
Delphi than BCB) they'd lost their momentum in the C++ market, and with MS's
aggressive pushing of MSVC (and later poaching of a significant number of
key Borland developers) it was pretty much impossible to come back again.

[...]
I still can't see where you get slow and bloated from. You really
should spend some time using .net; you will be surprised.
You've never tried running a .NET application on a machine with 512MB of
RAM, I guess? :)

Cold start times for .NET applications are terrible. Even a simple winforms
"Hello World" takes somewhere between 15-25 seconds (depending on the phase
of the moon) to start where the CLR isn't already loaded (for example, after
a reboot, or if you have 512 MB of RAM as the CLR files fall out of the
cache pretty quickly). Warm start times are better, though still in the
order of 3-4 seconds. Compared to, say, firefox, which cold-starts in a bit
under 10 seconds and warm-starts basically instantly. While this isn't a
problem for applications that run all the time, it prevents .NET from being
used for small utility applications - I don't want to wait for 3 or so
seconds while calc or notepad starts, for example. Similarly, memory usage is
terrible for .NET apps - a "Hello World" takes up around 7MB of RAM.
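(For concreteness, the winforms "Hello World" being timed is nothing more than
something like the following -- a minimal sketch, not the actual test program:

using System;
using System.Windows.Forms;

static class HelloWorld
{
    [STAThread]
    static void Main()
    {
        Form form = new Form();
        form.Text = "Hello World";
        Label label = new Label();
        label.Text = "Hello, world!";
        label.Dock = DockStyle.Fill;
        form.Controls.Add(label);
        Application.Run(form);   // message loop; exits when the form closes
    }
}

Even this pulls in the CLR, System.Windows.Forms and System.Drawing on a cold
start, which is where the load time and the working set come from.)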

As MS forces more and more .NET stuff on people, these problems will become
less significant since the framework will always be loaded - it's not really
a nice solution, though, as all that means is that you've got bloat loaded
all the time (taking up space that could be used for better purposes), not
just when you run an app that needs it.

Performance-wise, it's a bit hard to compare. For complex FP stuff, .NET is
slower (at least on K7 Athlons and P4's), no questions there. It depends on
the exact algorithm, but I've typically seen anything from a 20% to a 100%
increase in computation time required. CPU-intensive stuff in general is
slower, but most of the tests I've done are FP-based things so I can't give
a decent range for other types. For I/O intensive stuff such as throwing
files through sockets, most of the time is spent outside the application, so
although the application may only be operating at half the speed of a native
app it doesn't really matter. I could go on and on ... basically, CLR is
significantly slower than C++ code compiled to native code, but unless your
application is CPU-bound in your code, it doesn't matter a whole lot.
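The FP tests referred to are along these lines -- a made-up sketch of a
CPU-bound floating-point loop timed with Stopwatch (not the actual benchmark
code); the same loop compiled as native C++ is what it would be compared
against:

using System;
using System.Diagnostics;

class FpBench
{
    static void Main()
    {
        const int n = 10000000;
        double sum = 0.0;
        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 1; i <= n; i++)
            sum += Math.Sqrt(i) * Math.Sin(i);   // arbitrary FP work
        sw.Stop();
        Console.WriteLine("sum={0}  elapsed={1} ms", sum, sw.ElapsedMilliseconds);
    }
}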

The biggest problem I've found is that, although you can code a .NET app to be
fast, it's not encouraged. A lot of the things that make .NET development
easy also make the resulting application slow. The end result of this is
things like ATI's CCC - massive memory use, slow startup times, laggy
interface, etc etc. While I know that CCC is not representative of a "good"
.NET app, it is representative of a vast majority of .NET apps.

Just think, since you're such a good MS customer, maybe they
won't make you wait for months to fix a crash/serious bug that
arises in your code due to a framework error that only MS can
fix.

Hasn't happened to me yet, and I have been using .net for 4 years now.
While I can't say I've run into a problem that *can't* be worked around,
some bugs (mainly in the 1.0 and 1.1 era) have required significant and/or
ugly workarounds (which had to be developed using trial and error). One of
the things I love about Borland's VCL is that you get the source code, which
means you can easily trace into the VCL and see where something is going
wrong. This makes it much easier to develop workarounds compared to
developing them for .NET where if you tell MS "{x} doesn't work correctly"
you just get the response "don't do {x}, and it may be fixed in a future
service pack". At which point, if doing {x} is important to your code, you
have to start the tedious process of black-box debugging the framework.
Reflector can help a bit if you've narrowed it down, but it's still nowhere
as nice as being able to trace into the code.

--
Michael Brown
Add michael@ to emboss.co.nz - My inbox is always open
 
On Mon, 19 Feb 2007 13:18:52 +1100, "Michael Brown"
<see@signature.below> wrote:

The Real Andy wrote:
Jamie wrote:
[...]
Borland has never implemented any of MFC in any of the products shipped to
the end user. I don't know where you got this information from but
it's clearly incorrect. Borland has their own set of class libraries
that simply sprang from the early days of OWL. They call it the VCL
now. VCL = Visual Component Library.

Ahh, remember when Borland dropped OWL, they licenced MFC? Remember
that there was some condition regarding the approval that prevented
Borland from continuing with MFC? It was about this time that everyone
stopped using Borland products. VCL is completely different.

IIRC, Borland licenced MFC around the time of C++ 5. It was only included
for compatibility reasons so that you could compile your MFC apps with BC++,
and it didn't always work 100% (due to the dependence of MFC on MSVC
undocumented extensions). OWL was still very much the focus of BC++ 5.

BCB was the next product in line, and this used VCL instead of OWL. OWL
wasn't dropped until BCB5, though it was (like MFC) included only for
compiling older projects with the newer compiler. MFC compatibility is
included all the way through the BCB line AFAIK, though it's for compiling
only. You can't use the designer to make MFC GUIs.

There was a big debate a couple of years ago in one of the Borland NGs
about when/why/how Borland "lost" the C++ market. Their marketshare had
dropped substantially long before BCB was released, or OWL was ditched for
that matter. The basic problem was versions 4 and 5 of BC++ were pretty bad
(especially the IDE), and OWL was simply not being updated and fixed very
quickly. While they picked up some new people with the VCL (though more with
Delphi than BCB) they'd lost their momentum in the C++ market, and with MS's
aggressive pushing of MSVC (and later poaching of a significant number of
key Borland developers) it was pretty much impossible to come back again.
Sorry, I screwed up once again :( I meant to write that the MFC licence
condition was valid only if Borland dropped OWL. That was some time
ago now; I can't really remember what happened.

[...]
I still can't see where you get slow and bloated from. You really
should spend some time using .net; you will be surprised.

You've never tried running a .NET application on a machine with 512MB of
RAM, I guess? :)
Most of our customers run 512 MB of RAM (on PCs supplied by us).

Cold start times for .NET applications are terrible. Even a simple winforms
"Hello World" takes somewhere between 15-25 seconds (depending on the phase
of the moon) to start where the CLR isn't already loaded (for example, after
a reboot, or if you have 512 MB of RAM as the CLR files fall out of the
cache pretty quickly). Warm start times are better, though still in the
order of 3-4 seconds. Compared to, say, firefox, which cold-starts in a bit
under 10 seconds and warm-starts basically instantly. While this isn't a
problem for applications that run all the time, it prevents .NET from being
used for small utility applications - I don't want to wait for 3 or so
seconds while calc or notepad starts, for example. Similarly, memory usage is
terrible for .NET apps - a "Hello World" takes up around 7MB of RAM.
Ok, so the lowest CPU speed I have apps running on is around 2 GHz.
These are AU$400 PCs shipped with 512 MB RAM, running XP SP2. I have
never seen one take more than a few seconds to load the framework,
even after a reboot. BTW, there is tonnes of information out there on
how to reduce cold start times of apps. In fact MSDN mag did an
article on the subject not so long ago.

As MS forces more and more .NET stuff on people, these problems will become
less significant since the framework will always be loaded - it's not really
a nice solution, though, as all that means is that you've got bloat loaded
all the time (taking up space that could be used for better purposes), not
just when you run an app that needs it.
Hardware is cheap, development time is quick, results are relatively
robust (you can still write bad code). The whole idea with .net and
java is that you sacrifice some resources for the huge benefits gained
from safe, secure and reliable code.

Performance-wise, it's a bit hard to compare. For complex FP stuff, .NET is
slower (at least on K7 Athlons and P4's), no questions there. It depends on
the exact algorithm, but I've typically seen anything from a 20% to a 100%
increase in computation time required. CPU-intensive stuff in general is
slower, but most of the tests I've done are FP-based things so I can't give
a decent range for other types. For I/O intensive stuff such as throwing
files through sockets, most of the time is spent outside the application, so
although the application may only be operating at half the speed of a native
app it doesn't really matter. I could go on and on ... basically, CLR is
significantly slower than C++ code compiled to native code, but unless your
application is CPU-bound in your code, it doesn't matter a whole lot.
Fair point. I don't do FP stuff. Things such as casting classes to
binary buffers can be slow, but by marshalling to unmanaged and back
it is really fast. What's great about that is the marshaller will not
let you exceed the bounds of the array. You can't overrun a buffer. Big
plus. It does not matter how badly you code, you can't do it. It won't let
you. This is just one of the benefits. I challenge anyone to tell me
that they have not had a buffer overrun, even if it was found before
production.
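A rough sketch of the marshalling point (the buffer sizes and the specific
calls here are illustrative, not from any real project): Marshal.Copy refuses
to write past the end of the managed array and throws instead of silently
overrunning it.

using System;
using System.Runtime.InteropServices;

class MarshalDemo
{
    static void Main()
    {
        byte[] managed = new byte[16];
        IntPtr unmanaged = Marshal.AllocHGlobal(32);
        try
        {
            // Fill the unmanaged block from a managed source.
            Marshal.Copy(new byte[32], 0, unmanaged, 32);

            // Copying 32 bytes back into a 16-byte array is rejected with an
            // argument exception rather than corrupting memory.
            try
            {
                Marshal.Copy(unmanaged, managed, 0, 32);
            }
            catch (ArgumentException e)
            {
                Console.WriteLine("Overrun prevented: " + e.Message);
            }
        }
        finally
        {
            Marshal.FreeHGlobal(unmanaged);
        }
    }
}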

The biggest problem I've found is that, although you can code a .NET to be
fast, it's not encouraged. A lot of the things that make .NET development
easy also make the resulting application slow. The end result of this is
things like ATI's CCC - massive memory use, slow startup times, laggy
interface, etc etc. While I know that CCC is not representative of a "good"
.NET app, it is representative of a vast majority of .NET apps.
I wonder what Agent uses. Currently in my process list (Sysinternals)
I have Agent at 10 MB, followed by explorer (Win GUI shell) at 13 MB, and
Visual Studio 2005 at 40 MB. So I wonder what language Agent is written
in? I have seen C++ apps (Borland and MS) that have been more bloated
than C# apps.

Just think, since you're such a good MS customer, maybe they
won't make you wait for months to fix a crash/serious bug that
arises in your code due to a framework error that only MS can
fix.

Hasn't happened to me yet, and I have been using .net for 4 years now.

While I can't say I've run into a problem that *can't* be worked around,
some bugs (mainly in the 1.0 and 1.1 era) have required significant and/or
ugly workarounds (which had to be developed using trial and error). One of
the things I love about Borland's VCL is that you get the source code, which
means you can easily trace into the VCL and see where something is going
wrong. This makes it much easier to develop workarounds compared to
developing them for .NET where if you tell MS "{x} doesn't work correctly"
you just get the response "don't do {x}, and it may be fixed in a future
service pack". At which point, if doing {x} is important to your code, you
have to start the tedious process of black-box debugging the framework.
Reflector can help a bit if you've narrowed it down, but it's still nowhere
as nice as being able to trace into the code.
Maybe that's where I differ. I don't want the source code. I don't
care about it. The last thing I want to do is spend time learning my
way through some library. I just don't have time. My experiences with
.net have been good. What I can't do with it, I can do with Interop,
and it's rare that I need to go there. This is especially so with .net
2. All I really care about is giving the business what it wants, and
if I can do that, on time with a product that does not have bugs, then
I keep the business happy. If I keep the business happy I get
rewarded, both financially and with more work.
 
The Real Andy wrote:
Michael Brown wrote:
[...]
Sorry, I screwed up once again :( I meant to write that MFC licence
condition was valid only if Borland dropped OWL. That was some time
ago now, I cant really remember what happened.
Initially this was the case, which is why it took so long for BC++ to become
MFC compatible (Watcom, Symantec, etc. had MFC support years before Borland).
MS finally relented and allowed Borland to use both - I have no idea why.
Presumably money was involved somewhere.

[...]
Cold start times for .NET applications are terrible. Even a simple
winforms "Hello World" takes somewhere between 15-25 seconds
(depending on the phase of the moon) to start where the CLR isn't
already loaded (for example, after a reboot, or if you have 512 MB
of RAM as the CLR files fall out of the cache pretty quickly). Warm
start times are better, though still in the order of 3-4 seconds.
Compared to, say, firefox, which cold-starts in a bit under 10
seconds and warm-starts basically instantly. While this isn't a
problem for applications that run all the time, it prevents .NET
from being used for small utility applications - I don't want to
wait for 3 or so seconds while calc or notepad starts, for example.
Similarly, memory usage is terrible for .NET apps - a "Hello World"
takes up around 7MB of RAM.

Ok, so the lowest CPU speed I have apps running on is around 2 GHz.
These are AU$400 PCs shipped with 512 MB RAM, running XP SP2. I have
never seen one take more than a few seconds to load the framework,
even after a reboot.
Are you sure that you're actually testing a "cold boot" scenario? For
example, if you use ATI graphics drivers, the framework gets loaded during
boot. The warm and cold start times are about the same on my main computer
because of this (and somewhere around 1-1.5 seconds due to better hardware).
The times I mentioned above are on my "clean test" machine, an XP1700 with
512 MB RAM. This is a cleaner machine than most when it comes to stuff in
RAM, since I'm only testing a single app at a time (more or less just XP +
.NET frameworks). On another machine that I use (P4 ~2GHz, 512 MB RAM, but
with lots of junk loaded like a virus scanner, firewall, web/email clients,
etc ) I observe similar or worse times - worse because when I exit a .NET
application the framework tends to fall out of the cache pretty quickly as
memory is tight. Resulting in times closer to cold starts than warm starts.

BTW, there is tonnes of information out there on
how to reduce cold start times of apps. In fact MSDN mag did an
article on the subject not so long ago.
Yes, there are things you can do to improve cold (and warm) start times ...
but these times are for a "Hello World" app, which doesn't leave many
options for improvement.

[...]
Hardware is cheap, development time is quick, results are relatively
robust (you can still write bad code). The whole idea with .net and
java is that you sacrifice some resources for the huge benefits gained
from safe, secure and reliable code.
Safe and secure, maybe, but reliable? Allowing people to be sloppy when
coding is just asking for reliability issues further down the line. The
"ideal" language (IMO) should be strictly typed (no magic float-to-int
conversions for you!), require explicit memory allocation and deallocation
(with a garbage collector there solely to slap you if you forget) and allow
you to get your hands dirty and bit-bash if you want to. Ideally it should
also treat fundamental types as fundamental types - an integer is an
integer, not an object. Basically, make the programmer think about what
(s)he is writing, not just spew out a whole lot of code and make random
changes until it works. Unfortunately, the trend seems to be away from every
one of these aspects and towards allowing - even encouraging - poor
programming practices. While in the short term this may seem like a good
solution, 5 years down the line you'll end up with millions of lines of
poorly written code just held together by comments such as "I don't know why
the following line is needed but things break if it's removed".

/me gets off his soapbox :)
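A tiny illustration of the strict-typing point, using C# only because it is
the language under discussion (narrowing conversions must be spelled out,
widening ones are allowed):

class TypingDemo
{
    static void Main()
    {
        double x = 3.7;
        // int i = x;        // does not compile: no implicit double -> int
        int i = (int)x;      // fine, but you had to ask for the truncation
        double y = i;        // implicit int -> double is allowed
        System.Console.WriteLine("{0} {1}", i, y);   // prints "3 3"
    }
}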

[...]
Fair point. I don't do FP stuff. Things such as casting classes to
binary buffers can be slow, but by marshalling to unmanaged and back
it is really fast. What's great about that is the marshaller will not
let you exceed the bounds of the array. You can't overrun a buffer. Big
plus. It does not matter how badly you code, you can't do it. It won't let
you. This is just one of the benefits. I challenge anyone to tell me
that they have not had a buffer overrun, even if it was found before
production.
I can't recall the last time I had a buffer overrun in any of my Delphi
code. I have had the odd one when mixing Delphi and inline assembler (the
overflow in the assembler part), but assembler is assembler and those things
happen :) In my C++ code, I've had the odd one in parts which I've been
aggressively optimizing (as in "we need to squeeze more performance out of
this procedure, new is slowing us down, so let's coalesce all the unrelated
new calls into a single malloc and bitbash some things around" aggressive)
but not AFAICR in "normal" code. In these cases, a managed language would
have prevented the buffer overflows simply by not allowing me to optimize so
aggressively. Not exactly a plus in that case.

The reason for this IMO is that I've always been quite conscious of buffer
overruns and the like, coming from an assembly and software security
background. Of course, I can't say I've never had one - my memory doesn't go
back far enough, and I'm pretty certain that when I was starting programming
I overflowed a few buffers (either accidentally or on purpose ...).
Defensive programming beats a safety net any day of the week - a (prevented)
buffer overflow in a .NET application usually results in the application
crashing or being terminated. While this is better than a buffer overflow,
it's better still that the app figures out beforehand that the buffer is too
small and handles it gracefully. Additionally, most C/C++ compilers now
support buffer overflow detection with guard words. While this results in
guaranteed termination of the program (rather than possible recovery with an
exception), the result is not all that different to an exception firing
where you did not anticipate it happening.

The biggest problem I've found is that, although you can code a .NET
to be fast, it's not encouraged. A lot of the things that make .NET
development easy also make the resulting application slow. The end
result of this is things like ATI's CCC - massive memory use, slow
startup times, laggy interface, etc etc. While I know that CCC is
not representative of a "good" .NET app, it is representative of a
vast majority of .NET apps.

I wonder what agent uses.
As far as I can tell, it uses MSVC (and wxWindows for the gui). It's a
combination of C and C++. Though this is just from a quick glance through
the executable using notepad - it may be being crafty :)

Currently in my process list (Sysinternals)
I have Agent at 10MB, followed by explorer(Win GUI shell) at 13MB, and
Visual Studio 2005 at 40Mb.
Heh, Explorer on this computer is at 17 MB - tis what you get when you don't
log off for 9 days ... Biggest memory users at the moment are Opera (103
MB), Delphi 7 (35 MB), OE (31 MB). However, I wouldn't consider Opera
bloated - I've got about 25 tabs open across 2 windows, and then the cached
renderings for the previous page, plus cached files ... Delphi and OE are
also quite "busy", and the memory is being put to good use. The award for
bloat (being defined as memory used for no good purpose) on my computer goes
to the annoying ATI tray icon that just won't go away. It's currently using
up 7 MB to display an icon in the tray that if I double-click on it starts
up CCC.

[...]
While I can't say I've run into a problem that *can't* be worked
around, some bugs (mainly in the 1.0 and 1.1 era) have required
significant and/or ugly workarounds (which had to be developed using
trial and error). One of the things I love about Borland's VCL is
that you get the source code, which means you can easily trace into
the VCL and see where something is going wrong. This makes it much
easier to develop workarounds compared to developing them for .NET
where if you tell MS "{x} doesn't work correctly" you just get the
response "don't do {x}, and it may be fixed in a future service
pack". At which point, if doing {x} is important to your code, you
have to start the tedious process of black-box debugging the
framework. Reflector can help a bit if you've narrowed it down, but
it's still nowhere as nice as being able to trace into the code.

Maybe that's where I differ. I don't want the source code. I don't
care about it. The last thing I want to do is spend time learning my
way through some library. I just don't have time.
I'm coming somewhat from the same direction but with different experiences
... I don't have time to be trying to second-guess what's happening inside a
library. If an application isn't behaving as it should, there's either a bug
in my code or a bug in the library (or very occasionally a bug in the
compiler). Maybe I've just been unlucky with hitting lots of .NET framework
bugs, but I'm pretty sure I've spent more time in the last couple of years
trying to isolate and work around .NET bugs than I've spent tracing through
the VCL source code (and it's not due to a lack of time spent with the VCL).
Since I've written quite a few Delphi components I already knew (or had to
know) what a typical VCL component looks and acts like. Since the VCL
follows the same rules as the components I write, tracing through it is
pretty easy.

[...]

--
Michael Brown
Add michael@ to emboss.co.nz - My inbox is always open
 
On Tue, 20 Feb 2007 03:21:06 +1100, "Michael Brown"
<see@signature.below> wrote:

The Real Andy wrote:
Michael Brown wrote:
[...]
%<
[...]
Cold start times for .NET applications are terrible. Even a simple
winforms "Hello World" takes somewhere between 15-25 seconds
(depending on the phase of the moon) to start where the CLR isn't
already loaded (for example, after a reboot, or if you have 512 MB
of RAM as the CLR files fall out of the cache pretty quickly). Warm
start times are better, though still in the order of 3-4 seconds.
Compared to, say, firefox, which cold-starts in a bit under 10
seconds and warm-starts basically instantly. While this isn't a
problem for applications that run all the time, it prevents .NET
from being used for small utility applications - I don't want to
wait for 3 or so seconds while calc or notepad starts, for example.
Similarly, memory usage is terrible for .NET apps - a "Hello World"
takes up around 7MB of RAM.

Ok, so the lowest CPU speed I have apps running on is around 2 GHz.
These are AU$400 PCs shipped with 512 MB RAM, running XP SP2. I have
never seen one take more than a few seconds to load the framework,
even after a reboot.

Are you sure that you're actually testing a "cold boot" scenario? For
example, if you use ATI graphics drivers, the framework gets loaded during
boot. The warm and cold start times are about the same on my main computer
because of this (and somewhere around 1-1.5 seconds due to better hardware).
The times I mentioned above are on my "clean test" machine, an XP1700 with
512 MB RAM. This is a cleaner machine than most when it comes to stuff in
RAM, since I'm only testing a single app at a time (more or less just XP +
.NET frameworks). On another machine that I use (P4 ~2GHz, 512 MB RAM, but
with lots of junk loaded like a virus scanner, firewall, web/email clients,
etc ) I observe similar or worse times - worse because when I exit a .NET
application the framework tends to fall out of the cache pretty quickly as
memory is tight. Resulting in times closer to cold starts than warm starts.
IIRC the boxes I am using have onboard video, compliments of Intel.

All my testing is done on clean installs. A PC that you use for dev is
always going to be bloated with crap, and any app you run on it is
going to be slow. Same for running CAD apps.

However, I do have a few .net applications that I do run on my dev
machine for support reasons, some of which are quite big, and the
start times of all of those are quite good too. Mind you, I am going to
go and test cold start times of those now!!


BTW, there is tonnes of information out there on
how to reduce cold start times of apps. In fact MSDN mag did an
article on the subject not so long ago.

Yes, there are things you can do to improve cold (and warm) start times ...
but these times are for a "Hello World" app, which doesn't leave many
options for improvement.
If you are doing .net dev, it's worth reading this MSDN mag article.
http://msdn.microsoft.com/msdnmag/issues/06/02/CLRInsideOut/default.aspx

There are some other good ones around too.


Hardware is cheap, development time is quick, results are relatively
robust (you can still write bad code). The whole idea with .net and
java is that you sacrifice some resources for the huge benefits gained
from safe, secure and reliable code.

Safe and secure, maybe, but reliable? Allowing people to be sloppy when
coding is just asking for reliability issues further down the line. The
"ideal" language (IMO) should be strictly typed (no magic float-to-int
conversions for you!), require explicit memory allocation and deallocation
(with a garbage collector there solely to slap you if you forget) and allow
you to get your hands dirty and bit-bash if you want to. Ideally it should
also treat fundamental types as fundamental types - an integer is an
integer, not an object. Basically, make the programmer think about what
(s)he is writing, not just spew out a whole lot of code and make random
changes until it works. Unfortunately, the trend seems to be away from every
one of these aspects and towards allowing - even encouraging - poor
programming practices. While in the short term this may seem like a good
solution, 5 years down the line you'll end up with millions of lines of
poorly written code just held together by comments such as "I don't know why
the following line is needed but things break if it's removed".

/me gets off his soapbox :)
You will always get programmers that write bad code, in any language.
.net and Java are helping the cause by taking care of that aspect as
much as possible.

[...]
Fair point. I don't do FP stuff. Things such as casting classes to
binary buffers can be slow, but by marshalling to unmanaged and back
it is really fast. What's great about that is the marshaller will not
let you exceed the bounds of the array. You can't overrun a buffer. Big
plus. It does not matter how badly you code, you can't do it. It won't let
you. This is just one of the benefits. I challenge anyone to tell me
that they have not had a buffer overrun, even if it was found before
production.

I can't recall the last time I had a buffer overrun in any of my Delphi
code. I have had the odd one when mixing Delphi and inline assembler (the
overflow in the assembler part), but assembler is assembler and those things
happen :) In my C++ code, I've had the odd one in parts which I've been
aggressively optimizing (as in "we need to squeeze more performance out of
this procedure, new is slowing us down, so let's coalesce all the unrelated
new calls into a single malloc and bitbash some things around" aggressive)
but not AFAICR in "normal" code. In these cases, a managed language would
have prevented the buffer overflows simply by not allowing me to optimize so
aggressively. Not exactly a plus in that case.

The reason for this IMO is that I've always been quite conscious of buffer
overruns and the like, coming from an assembly and software security
background. Of course, I can't say I've never had one - my memory doesn't go
back far enough, and I'm pretty certain that when I was starting programming
I overflowed a few buffers (either accidentally or on purpose ...).
Defensive programming beats a safety net any day of the week - a (prevented)
buffer overflow in a .NET application usually results in the application
crashing or being terminated. While this is better than a buffer overflow,
it's better still that the app figures out beforehand that the buffer is too
small and handles it gracefully. Additionally, most C/C++ compilers now
support buffer overflow detection with guard words. While this results in
guaranteed termination of the program (rather than possible recovery with an
exception), the result is not all that different to an exception firing
where you did not anticipate it happening.

The biggest problem I've found is that, although you can code a .NET
to be fast, it's not encouraged. A lot of the things that make .NET
development easy also make the resulting application slow. The end
result of this is things like ATI's CCC - massive memory use, slow
startup times, laggy interface, etc etc. While I know that CCC is
not representative of a "good" .NET app, it is representative of a
vast majority of .NET apps.

I wonder what Agent uses.

As far as I can tell, it uses MSVC (and wxWindows for the gui). It's a
combination of C and C++. Though this is just from a quick glance through
the executable using notepad - it may be being crafty :)

Currently in my process list (Sysinternals)
I have Agent at 10 MB, followed by explorer (Win GUI shell) at 13 MB, and
Visual Studio 2005 at 40 MB.

Heh, Explorer on this computer is at 17 MB - tis what you get when you don't
log off for 9 days ... Biggest memory users at the moment are Opera (103
MB), Delphi 7 (35 MB), OE (31 MB). However, I wouldn't consider Opera
bloated - I've got about 25 tabs open across 2 windows, and then the cached
renderings for the previous page, plus cached files ... Delphi and OE are
also quite "busy", and the memory is being put to good use. The award for
bloat (being defined as memory used for no good purpose) on my computer goes
to the annoying ATI tray icon that just won't go away. It's currently using
up 7 MB to display an icon in the tray that if I double-click on it starts
up CCC.
Those values were for Vista.

While I can't say I've run into a problem that *can't* be worked
around, some bugs (mainly in the 1.0 and 1.1 era) have required
significant and/or ugly workarounds (which had to be developed using
trial and error). One of the things I love about Borland's VCL is
that you get the source code, which means you can easily trace into
the VCL and see where something is going wrong. This makes it much
easier to develop workarounds compared to developing them for .NET
where if you tell MS "{x} doesn't work correctly" you just get the
response "don't do {x}, and it may be fixed in a future service
pack". At which point, if doing {x} is important to your code, you
have to start the tedious process of black-box debugging the
framework. Reflector can help a bit if you've narrowed it down, but
it's still nowhere as nice as being able to trace into the code.

Maybe that's where I differ. I don't want the source code. I don't
care about it. The last thing I want to do is spend time learning my
way through some library. I just don't have time.

I'm coming somewhat from the same direction but with different experiences
... I don't have time to be trying to second-guess what's happening inside a
library. If an application isn't behaving as it should, there's either a bug
in my code or a bug in the library (or very occasionally a bug in the
compiler). Maybe I've just been unlucky with hitting lots of .NET framework
bugs, but I'm pretty sure I've spent more time in the last couple of years
trying to isolate and work around .NET bugs than I've spent tracing through
the VCL source code (and it's not due to a lack of time spent with the VCL).
Since I've written quite a few Delphi components I already knew (or had to
know) what a typical VCL component looks and acts like. Since the VCL
follows the same rules as the components I write, tracing through it is
pretty easy.

[...]
 
