101010.pl is one of the many independent Mastodon servers you can use to participate in the fediverse.
101010.pl is the oldest Polish Mastodon server. We support posts of up to 2,048 characters.

#cs

Replied in thread

@cstross “People who actually want a personal computer they can program are a niche market, albeit vastly larger than in 1982”

The roots of the #Mega65 lie in #CS #education, with #students unable to grasp the basics not because of a lack of intelligence, but through a lack of exposure to basic computing hardware.

“By insulating new computer science and IT students from how computers really work, we may well be disadvantaging them, by preventing them from learning how a computer really works. It's quite the same idea as starting a mechanic on a simple old car, instead of on a nuclear submarine: make the important details visible so that they can be learnt.” — Paul Gardner-Stephen

<c65gs.blogspot.com/2015/12/is->

@swelljoe @mos_8502


#linux #cs #computer #science #directory #fileSystem

Yesterday I extracted the contents of an initrd.img that came packaged in a Fedora distribution. From kernel.org I learned that this mini-OS can be used by the kernel during the boot-up process.

I guess I could have mounted the directory to play with the tools it is configured with. But I realized I do not know the difference between a directory and a file system.

Please help me understand the difference between these two concepts.
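For what it's worth, one way to make the distinction concrete is a short Python sketch (the helper name `same_filesystem` is my own invention): a directory is just a node in the tree of names, while a file system is a mounted structure that backs part of that tree, and `st_dev` reports which mounted file system a given path lives on.

```python
import os

def same_filesystem(a: str, b: str) -> bool:
    """Two paths live on the same mounted file system iff their
    device IDs match; directories are merely names within that tree."""
    return os.stat(a).st_dev == os.stat(b).st_dev

# The root directory is trivially on its own file system; on Linux,
# /proc is typically a separate virtual file system mounted into the tree.
print(same_filesystem("/", "/"))  # True
```

So a directory can span a mount point in name only: stepping into a mounted subtree changes the underlying file system even though the path looks like one continuous tree.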

I've posted a question on cs stackexchange as I think there is a typo in the textbook exercise on λ2 church numerals and operations.

It says "Add" should be

Add ≡ λm,n : Nat. λα : ∗. λf : α → α. λx : Nat. m α f (n α f x)

but I think it should be (Nat changed to α)

Add ≡ λm,n : Nat. λα : ∗. λf : α → α. λx : α. m α f (n α f x)

I'd welcome guidance.

more context here:
cs.stackexchange.com/questions
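The correction looks right to me: in m α f (n α f x), the subterm n α f x has type α, so x itself must be typed α for n α f : α → α to apply to it. With the types erased, the corrected definition runs fine in plain Python (a sketch; the names are mine):

```python
# Untyped Church numerals: a numeral n applies f to x exactly n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Type-erased form of: Add ≡ λm,n. λf. λx. m f (n f x)
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting applications of f."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```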

When I was in #CS grad school, back in the early 1990s, #wavelets were hot in 3D volumetric CG—oh, those SIGGRAPH symposia on the topic. At the same time in #EE, loads of papers were published on their efficacy in DSP. Just about everyone in EE and CS seemed to have published at least one paper on wavelets. Fun times. But the stream of academic wavelet #research seems to have dried up.

I don't quite understand why wavelet transform has not supplanted Fourier transform in many #engineering and #computing application domains, considering its estimable time-frequency locality and its prodigious multi-resolution analysis capabilities, compared to Fourier analysis.

I am but a mere "maths carpenter". So, what am I missing, I wonder.
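To illustrate the locality point with the simplest possible case, here is one level of an (unnormalised) Haar analysis step in Python—a toy sketch, not production DSP: the averages give the coarse approximation, while each difference records detail at one specific position, unlike a Fourier coefficient, which mixes contributions from the entire signal.

```python
def haar_step(signal):
    """One unnormalised Haar analysis step on an even-length signal.
    Returns (approximation, detail) at half the resolution."""
    pairs = list(zip(signal[0::2], signal[1::2]))
    approx = [(a + b) / 2 for a, b in pairs]
    detail = [(a - b) / 2 for a, b in pairs]
    return approx, detail

# A jump confined to the first pair shows up only in the first
# detail coefficient; the second pair's detail stays zero.
print(haar_step([4, 2, 5, 5]))  # ([3.0, 5.0], [1.0, 0.0])
```

Recursing on the approximation list yields the multi-resolution pyramid.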

Sometimes, #deconvolution is used by #CS folks to mitigate noise and distortion in an image, provided the characteristic function of the interference source can be measured (or modelled).

I wonder if #radar #EE folks have tried deconvolving the reflected signal with a measured (or modelled) topography of the operating area, so as to cure the ills caused by the ground clutter.
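In the noiseless, discrete case the underlying idea is just polynomial division: convolution multiplies the signal by the system's impulse response, and deconvolution divides it back out. A toy Python sketch (function names are mine; real deconvolution must regularise against noise, e.g. Wiener filtering):

```python
def convolve(x, h):
    """Discrete linear convolution of sequences x and h."""
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

def deconvolve(y, h):
    """Recover x from y = convolve(x, h) by long division.
    Exact only for noiseless data with h[0] != 0; with noise this
    blows up, which is why practical methods regularise."""
    y = list(y)
    x = []
    for i in range(len(y) - len(h) + 1):
        c = y[i] / h[0]
        x.append(c)
        for j, hj in enumerate(h):
            y[i + j] -= c * hj
    return x

echo = convolve([1.0, 2.0, 3.0], [1.0, 0.5])  # signal blurred by an "echo"
print(deconvolve(echo, [1.0, 0.5]))  # [1.0, 2.0, 3.0]
```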

Unfortunately, my workhorse computer is now offline, and since I am in another city, I have no access to it. Most likely there has been a #poweroutage, which is now very common in my country, #iran.

I am doing experiments with #wakegp to see if my simple method for parsimony pressure is effective. So far, its effect seems to be very small. I'm thinking of other methods for parsimony pressure, such as bucketing and tournament selection (on size instead of fitness).
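For contrast, the two flavours in miniature (a hypothetical Python sketch; `penalised_fitness` and `size_tournament` are made-up names): parametric parsimony pressure subtracts a size penalty from raw fitness, while a size tournament selects on program size alone.

```python
import random

def penalised_fitness(raw, size, c=0.01):
    """Parametric parsimony pressure: trade raw fitness against size."""
    return raw - c * size

def size_tournament(pop, k=2, rng=random):
    """Draw k random individuals and return the smallest one.
    pop is a list of (fitness, size) pairs."""
    return min(rng.sample(pop, k), key=lambda ind: ind[1])

pop = [(0.9, 120), (0.85, 40), (0.7, 15)]
print(size_tournament(pop, k=3))  # (0.7, 15): the smallest wins
```

The constant c is the knob: too small and bloat wins; too large and the search collapses onto trivially tiny programs.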

I am expecting to deliver results in summer, God willing.

It vexes me that UI books written by current IT practitioners inevitably descend into API-call-fest—a code dump, as it were. This sort of presentation is ineffective. In fact, it is childish.

A typical #programmer in IT who designs and implements a non-trivial UI, webtop or desktop, is seasoned, experienced; he knows how to use an API. But he does not necessarily have a background in usability:

PSYCHOLOGY
• The #psychology of visual and tactile perceptions
• The psychology of human-computer interaction #HCI
• The design and administration of psychological experiments on interaction and usability
• The design of the interaction flow

LIBRARY SCIENCE
• The design of the underlying information architecture
• The design of information layout

VISUAL ARTS
• The effective use of colour
• The effective use of font

There are tonnes of other usability-related subjects that fall way outside of modern CS curricula. The #UI books should aspire to teach these cross-discipline subjects on #usability to #CS and #IT practitioners.

Sure, code samples and screenshots of some popular UI framework would be helpful. But the main thrust of these books must be usability, organised on the assumption that the reader is a tech-savvy engineer or programmer, not a novice.

I was an #EE undergrad, when I backed into #CS. This was the age when assembly was the JavaScript of the day and structured programming was the state-of-the-fart. So, it was a jolt, when I first encountered Test-Driven Development #TDD in the early 2000s, thanks to the luminaries like Kent Beck, et al. Brilliant stuff!

But, at the risk of being drawn and quartered, I do say that TDD is a bit of a misnomer and an overstatement. Honestly, answer this: do EEs really create the hardware test suites first, before conducting analysis and design, and do CSs really create the software test suites first, before performing analysis and design?

No, we do not! We analyse the problem, we study the requirements, we search for inspiration in the literature, we sip our tea or coffee, we select a candidate solution, we create a plausible design, we implement a prototype, then we test—yes, THEN, WE TEST—if our wild imaginations have any basis in reality at all.

So, I prefer a more down-to-earth description of TDD, which says, "test early, test often, test as much as practicable", not "test first, because".

Here is a partial list of things in #CS and #IT that truly vex me:

• The use of the nonsensical mm/dd/yy date format and the equally nonsensical 12-hour time format in programming, especially when the leading zero is dropped, instead of the ISO UTC date-time format
• The incomplete, inconsistent implementations of time zones, the 19th Century anachronism
• Continued reliance on the OSs that trace their design roots back to the early 1960s
• The web browser’s futile, even if valiant, attempts to replicate the functionalities of the OS in the name of greater security and easier deployment, instead of supporting modern, standardised security and delivery mechanisms directly in the OS, thereby eliminating unnecessary complexities and inefficiencies
• The mushrooming of web GUI frameworks that are touted as "new" but in truth are mere rehashes of the "old" one that came out three months ago, instead of standardising on a standardised look-and-feel that promotes usability, portability, and maintainability
• The coders who, with no training in psychology, have the temerity to design nonsensical UIs purely for fancy effects, instead of aiming for consistency, predictability, and usability
• The practice of shoehorning modern programming concepts and facilities into old languages that were designed when the hardware occupied an entire floor and the software occupied a small deck of punchcards, instead of letting those old languages retire with dignity befitting their history
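On the first point, the ambiguity is easy to demonstrate in Python (a minimal sketch): "3/4/25" could be March 4th or April 3rd, whereas ISO 8601 admits exactly one reading, and its lexicographic order coincides with chronological order.

```python
from datetime import datetime, timezone

t = datetime(2025, 3, 4, 13, 5, tzinfo=timezone.utc)

# mm/dd/yy with leading zeros dropped: March 4th or April 3rd?
ambiguous = f"{t.month}/{t.day}/{t.year % 100}"
print(ambiguous)      # 3/4/25

# ISO 8601: one reading only, sorts correctly as a plain string.
print(t.isoformat())  # 2025-03-04T13:05:00+00:00
```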

To my #CS #professor friends on MathsTodon.xyz: what #programming languages are you teaching as the "proper first language" to your freshmen, and what are the delights and dismays thereof?

Some 30 years ago, I taught C and ML to CS undergrads—my two favourite classic languages. But, boy, they were a handful to teach to novices. But then, you lot might well have an easier task now, given that a typical CS freshman today knows several languages already (at least #Python and/or #JavaScript🤦‍♂️) by the time they enter the uni.

I think I need to go back to learning programming languages by reading books and then coding.
I remember I learnt Java that way, all the way back in 2014, and everything is still imprinted in my brain.
Going to do that for C++ now. Need to master that before September this year.