Programming thread -

SIGSEGV

Segmentation fault (core dumped)
True & Honest Fan
kiwifarms.net
What do you think, is the self-taught programmer a meme or can it be done?
It's not a meme at all. I took an AP class in high school that taught me Java, but taught myself C and C++ in college. As long as you can understand basic logic and control flow, there's no reason that you shouldn't be able to teach yourself to program.
 

Spamy the Bot

Notorious Moon
kiwifarms.net
It's not a meme at all. I took an AP class in high school that taught me Java, but taught myself C and C++ in college. As long as you can understand basic logic and control flow, there's no reason that you shouldn't be able to teach yourself to program.
That's pretty cool. I will try looking into it. How do you practice this thing? Even I am not autistic enough just to make word processors for practice.
 

SIGSEGV

Segmentation fault (core dumped)
True & Honest Fan
kiwifarms.net
That's pretty cool. I will try looking into it. How do you practice this thing? Even I am not autistic enough just to make word processors for practice.
The nice thing about teaching myself while I was in college was being able to use homework assignments and projects as practice. I also write C++ for a living, so that helps. For someone that's just getting started, I wouldn't really focus on practicing so much as I would on getting a "how to write C#" book and following along with the exercises/examples it provides. Familiarize yourself with the language first, then figure out what you'd like to do with it.
 

Spamy the Bot

Notorious Moon
kiwifarms.net
The nice thing about teaching myself while I was in college was being able to use homework assignments and projects as practice. I also write C++ for a living, so that helps. For someone that's just getting started, I wouldn't really focus on practicing so much as I would on getting a "how to write C#" book and following along with the exercises/examples it provides. Familiarize yourself with the language first, then figure out what you'd like to do with it.
Okay. I will do that! Also sorry for spamming up the thread.
Gotta keep an open mind and be curious.
 

Kosher Salt

(((NaCl)))
kiwifarms.net
The pitfall of self-taught programming is that you're likely to teach yourself some bad coding habits that you'll have to break later. Of course that's also just part of the learning process, so it's not entirely bad. Even if you know how to code, though, if you start at someplace new you're going to have to learn their coding standards and how to find your way around their toolchain and codebase.
 

Spamy the Bot

Notorious Moon
kiwifarms.net
The pitfall of self-taught programming is that you're likely to teach yourself some bad coding habits that you'll have to break later. Of course that's also just part of the learning process, so it's not entirely bad. Even if you know how to code, though, if you start at someplace new you're going to have to learn their coding standards and how to find your way around their toolchain and codebase.
Yeah, corporate knowledge is always there. Overall it's a fascinating world; I want to be more well-read in computers and tech anyway.
I mean, sure, I won't get to build a cool robot gf with rocket boots, but at least I won't be the guy who treats computers like magic boxes.
I've seen that way too much in my life.
 

Citation Checking Project

Hellstate man trapped in parents' house
True & Honest Fan
kiwifarms.net
Is everything that can even slightly be construed as racist just gonna be changed now?
Dunno about that. It's an international culture and the "Fuck Amerilibs" camp is quite strong; see the LKML. OTOH, the other side seems willing to pull media blitzes, e.g. the ZDNet article, and they always roll in with their PRs saying "look, so-and-so already cucked to me" even when it's not true, which is quite powerful in this post-truth era that we live in. They might just steamroll the entire thing. Also can't believe how shit their citation is.
 

cecograph

kiwifarms.net
I don't see a problem here. The "master" branch default doesn't make much sense these days and I agree that it's not inclusive enough and likely to offend African Americans. Here's how I'd change things:

Code:
SYNOPSIS
       git init [-q | --quiet] [--bare] [--template=<template_directory>]
                 [--separate-git-dir <git dir>]
                 [--shared[=<permissions>]] <default branch name> [directory]

DESCRIPTION
       This command creates an empty Git repository - basically a .git directory with
       subdirectories for objects, refs/heads, refs/tags, and template files. An initial HEAD
       file that references the HEAD of the branch named <default branch name> is also 
       created.

EXAMPLES
       Start a new Git repository for an existing code base

               $ cd /path/to/my/codebase
               $ git init niggerfaggot    (1)
               $ git add .                (2)
               $ git commit               (3)
 

Pepsi-Cola

Fuck Cumrobbery!
True & Honest Fan
kiwifarms.net
What do you think, is the self-taught programmer a meme or can it be done?
Not a meme, but like any skill that takes a huge time investment to become proficient in, if you don't enjoy what you're doing it's hard to stay motivated to continue learning it when you have nothing to hold you accountable for progress.

With that being said, the good news is there's tons of free resources out there to help you learn. Python and JavaScript have tons of beginner-friendly tutorials and books and shit.
 

moocow

Moo.
True & Honest Fan
kiwifarms.net
Is everything that can even slightly be construed as racist just gonna be changed now?
And it won't be enough. Even if they manage to change every fucking thing they've ever been offended by, they'll just find new things to be offended by to demand they change as well.

Github can go fuck themselves. My repos will continue to have/use the "master" branch like they always have.
 

SickNastyBastard

::::::::::::::::::::::::::::::::::::::::::::::::::
True & Honest Fan
kiwifarms.net
The pitfall of self-taught programming is that you're likely to teach yourself some bad coding habits that you'll have to break later. Of course that's also just part of the learning process, so it's not entirely bad. Even if you know how to code, though, if you start at someplace new you're going to have to learn their coding standards and how to find your way around their toolchain and codebase.
I agree, but I'd like to say that schools can teach you bad coding habits, and jobs can teach you bad coding habits. Every job I've had had a different way of writing, source control, and programming style.
I found these to be a good foundation:
1.) decouple what you can
2.) make it as simple as possible
3.) error handle/log the shit out of it
4.) make it scalable
5.) make sure you have a rock-solid integration process for source control
6.) make it easy to read, and know your database's differences
7.) write automated tests; if not in the project, enforce them in continuous integration, and be able to roll back easily
8.) always have a backup of the backup's backup
9.) above all those though, make sure you know exactly what you intend to do before starting; look at the idea of readme-driven development. Make sure when someone asks why you used what you did you can give an overly detailed explanation, and take the future into account


I've seen arguments about just about everything, from stacks down to 13 pages on binary operators. That's excessive. Oh! Document shit for you and the next guy, because you'll forget why you made it the way you did, and the next guy will have an easier time understanding the purpose.
 

AmpleApricots

kiwifarms.net
Most of the pioneers were self-taught. Are things more complex now? Yes. More difficult? I'm not so sure. For starters, you no longer have to know anything about computers to write something that'll actually run at a usable speed.

Apropos knowing nothing about anything: I stumbled into golang and I like it. It does away with things I'm fine doing away with in current year. It kinda reminds me of lua, which I always liked, with its focus on minimalism (there are also some other similarities). I started writing sort of a TUI toolbox on top of termbox for it and it all just sorta writes itself (yeah, I know there are already ready-made ones, but they have way too many dependencies). I keep feeling like I'll eventually stumble across something that'll make it absolutely horrid, but it hasn't happened so far.
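For anyone wondering why a TUI toolbox "just sorta writes itself": most of the work is an off-screen grid of cells you fill and then flush to the backend in one go. A rough sketch of that idea (the `Cell`/`Buffer` names here are hypothetical, not termbox-go's actual API):

```go
package main

import "fmt"

// Cell is one character cell on screen; Buffer is the off-screen grid a
// TUI toolkit fills up before flushing it to a backend like termbox.
type Cell struct {
	Ch     rune
	Fg, Bg uint16
}

type Buffer struct {
	W, H  int
	Cells []Cell // row-major: index = y*W + x
}

func NewBuffer(w, h int) *Buffer {
	return &Buffer{W: w, H: h, Cells: make([]Cell, w*h)}
}

// SetText writes a string starting at (x, y), clipping at the right edge.
func (b *Buffer) SetText(x, y int, s string) {
	if y < 0 || y >= b.H {
		return
	}
	for i, ch := range []rune(s) {
		if x+i >= b.W {
			return
		}
		b.Cells[y*b.W+x+i].Ch = ch
	}
}

func main() {
	b := NewBuffer(10, 2)
	b.SetText(0, 0, "hello")
	fmt.Printf("%c%c\n", b.Cells[0].Ch, b.Cells[1].Ch) // prints "he"
}
```

Widgets then become functions that paint into the buffer, and the backend only ever sees one flush per frame.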
 

Dandelion Eyes

kiwifarms.net
huge time investment
Yeah, that's the keyword. 'Cause it seems that a lot of people think something along the lines of: "Programming? Piece of cake! I'll just do this online course on Node.js and I'll become a professional in no time!" And then these newbies apply for jobs and fail to answer the most basic questions, if they get through the screening process at all.

I don't mean to say that coding is exceptionally hard (although at times it seems that you gotta have a special autistic mindset to be a coder), but you'll still have to invest years to learn the theory and improve your skills to be worth anything.

...also I think I'm gonna call my master branch on Github a "slaver" from now on just to spite them.
 

ConcernedAnon

Concerned and hopefully anonymous
kiwifarms.net
So I was rereading the thread, and I don't know why I didn't reply to this then, but what was this? I have no idea what the code block in the middle is supposed to be, aside from out-of-context nonsense, but is this supposed to be suggesting that hardware threading isn't provided by Windows? That, at least, is provably false.
I have 25+ years of writing internals software. It is obvious you don't know what a scheduler is. (Hint: it doesn't matter what the language is.)

Kiss my ass and learn to use SoftICE (old school) or windbg.

Code:
_asm cli
(NTCREATEFILE)(SYSTEMSERVICE(ZwCreateFile)) = NewNtCreateFile;
_asm sti

Code:
NTSTATUS NewNtCreateFile(
                    PHANDLE FileHandle,
                    ACCESS_MASK DesiredAccess,
                    POBJECT_ATTRIBUTES ObjectAttributes,
                    PIO_STATUS_BLOCK IoStatusBlock,
                    PLARGE_INTEGER AllocationSize OPTIONAL,
                    ULONG FileAttributes,
                    ULONG ShareAccess,
                    ULONG CreateDisposition,
                    ULONG CreateOptions,
                    PVOID EaBuffer OPTIONAL,
                    ULONG EaLength)
{
    NTSTATUS rc;

    // you can do whatever you want here: inspect or modify the arguments,
    // then typically call the saved original NtCreateFile and capture its
    // status in rc before returning it
    return rc;
}
Get bent.


Yes, I'm being an asshole.
Simple C# test to demonstrate the benefits of multithreading:
C#:
static void Main(string[] args)
{
    const int sumCount = 1_000_000; // Number of times to add to the sum
    const int tryCount = 10; // Number of tries, just to even out performance
    const int threadCount = 16; // Number of threads for the threaded test
    // If you aren't on a ryzen use a lower count to get more representative results


    decimal sum = 0; // Our sum, decimal so that it can contain the result
    object syncRoot = new object(); // Used to synchronize the sum
    Stopwatch timer = new Stopwatch();

    Action testFunc = () => // It's lazy, but it's easier to write the test apparatus with lambdas instead of a for-purpose class, doesn't matter though, they compile to the same thing
    {
        decimal localSum = 0;
        for (int i = 0; i < sumCount; i++) // Sum locally to avoid contention
        {
            localSum += i;
        }

        lock (syncRoot)
            sum += localSum; // Add the local sum under a lock. Happens once, allowing the vast majority of work to happen freely
    };


    Action resetTest = () =>
    {
        sum = 0;
        timer.Reset();
    };

    SemaphoreSlim semaphore = new SemaphoreSlim(0, threadCount);
    ThreadStart threadedTestFunc = () =>
    {
        semaphore.Wait();
        testFunc();
    };


    long syncTime = 0, asyncTime = 0;
    for (int j = 0; j < tryCount; j++) // Run our tests
    {
        {
            timer.Start(); // Run our test synchronously
            for (int t = 0; t < threadCount; t++)
                testFunc();
            timer.Stop();
            syncTime += timer.ElapsedMilliseconds;
        }

        resetTest();

        {
            Thread[] threads = new Thread[threadCount];
            for (int t = 0; t < threadCount; t++) // Set up our async test
            {
                threads[t] = new Thread(threadedTestFunc);
                threads[t].Start(); // Run it, it'll wait for the semaphore
            }

            timer.Start(); // Start recording before releasing
            semaphore.Release(threadCount);
            for (int t = 0; t < threadCount; t++)
                threads[t].Join(); // Wait for results before finishing our recording
            timer.Stop();
            asyncTime += timer.ElapsedMilliseconds;
        }

        resetTest();
    }

    syncTime /= tryCount; // Normalize by test count
    asyncTime /= tryCount;

    Console.WriteLine($"Synchronous test took {syncTime}ms");
    Console.WriteLine($"Asynchronous test with {threadCount} threads took {asyncTime}ms");
    Console.WriteLine($"Performance ratio was {(float)syncTime/(float)asyncTime} with {threadCount} threads");
}
The results were as follows;
Synchronous test took 412ms
Asynchronous test with 16 threads took 34ms
Performance ratio was 12.117647 with 16 threads
Keep in mind that 8 of those 16 threads aren't hardware threads per se; rather, they just have resource sharing and fast context switching, so 12 times faster completion is actually better than expected.
I have no idea what this argument about the scheduler is, given that it doesn't change the fact that the task was finished faster by more threads anyway, and in a fashion roughly proportional to the number of threads. I mean, note that the program is actually designed to avoid contention, so it actually runs concurrently; if the argument is that you don't get a performance benefit when everything has to be accessed under a mutex, or through streams, then yeah, no shit.

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

In general, it seems like there's a lot of fear directed towards concurrent programming, but processors are getting more cores, not faster, so everyone's going to have to adapt at some point. It's not that bad once you get your feet on the ground. The main principle is either to protect things by limiting their access, or failing that, by mutex. Where applicable you want to leverage atomics, but those should usually be used in well-tested generic algorithms, not developed for-purpose unless it's really justified or trivial.

On the topic of schedulers, my current project is a scheduler of sorts, meant to order and execute jobs asynchronously. The main principles of it are waypoints and syncpoints. Waypoints order execution: first they are ordered against each other by contracts such as before x, during x, or after x, and then jobs are ordered against waypoints by similar contracts. Syncpoints determine when it is safe to execute a scheduled job, with each job having its own set of syncpoints which are checked against the syncpoints of running jobs for conflict. This has the benefit of eliminating the risk of deadlock, while allowing unbound parallel performance, and allowing a scalable means of ordering tasks.

The immediate intended application of this is a game engine with an entity-based scheme. As an example, a particular AI job might be scheduled to occur during the ai-main waypoint, and would specify as syncpoints that it needs read-only access to component data, navigation, and transforms, and that it needs full access to a subset of the AI system. So then AI jobs affecting unentangled types of AI would be free to run concurrently, without having to state an explicit ordering. The systems themselves need to be concurrency-safe of course, but the jobs using them are free to eschew locks in favor of contracts. If that sounds too difficult, specifying no syncpoints would cause the job to run synchronously, and then at a later point a more specific set could be specified to ease restrictions.
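The syncpoint check described above boils down to a classic read/write conflict test: two jobs can overlap only if neither writes anything the other touches. A hypothetical sketch of that check (the `Job` type, field names, and example resources are mine, not the poster's actual implementation):

```go
package main

import "fmt"

// Job declares its syncpoints as read and write sets over named resources.
type Job struct {
	Name   string
	Reads  map[string]bool
	Writes map[string]bool
}

// conflicts reports whether two jobs may NOT run concurrently:
// a write by either job to anything the other reads or writes is a conflict.
func conflicts(a, b Job) bool {
	for w := range a.Writes {
		if b.Reads[w] || b.Writes[w] {
			return true
		}
	}
	for w := range b.Writes {
		if a.Reads[w] || a.Writes[w] {
			return true
		}
	}
	return false
}

func main() {
	aiNav := Job{"ai-nav",
		map[string]bool{"transforms": true, "navmesh": true},
		map[string]bool{"ai-state": true}}
	anim := Job{"anim",
		map[string]bool{"transforms": true},
		map[string]bool{"skeletons": true}}
	physics := Job{"physics", nil,
		map[string]bool{"transforms": true}}

	fmt.Println(conflicts(aiNav, anim))    // false: both only read transforms
	fmt.Println(conflicts(aiNav, physics)) // true: physics writes what ai-nav reads
}
```

Since jobs only ever wait on this up-front check rather than on locks held mid-execution, there's no lock-ordering cycle to deadlock on, which matches the deadlock-freedom claim above.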
 

Piss Clam

Squeeze me.
kiwifarms.net
I have no idea what the code block in the middle is supposed to be aside from out of context nonsense, but is this supposed to be suggesting that hardware threading isn't provided by windows? That at least is provably false.
Nope, you misunderstand. The code block is just an example of kernel coding. What I was talking about was scheduling. The fact you use C# is kinda amusing, but eh, you get the scheduling for free, now don't you?


Didn't I tell you I was being an asshole in that post? Also known as shitposting.
 