The Learning Web


The Learning Web | Ithaca, NY

Program areas at The Learning Web

The youth outreach program. Youth outreach offers support and assistance to young people, ages 16-25, who are homeless and living on their own without a parent or guardian. These young people receive help with all aspects of independent living, with the goal of moving with confidence toward self-sufficiency.

The youth exploration program. Youth exploration offers community learning experiences to youth, age 12 and older, who are living at home and attending school. Tailored to each young person's interests, these programs include career exploration tours, apprenticeships, and community service.

Who funds The Learning Web

Grants from foundations and other nonprofits

Grantmaker | Grantmaker tax period | Description | Amount
United Way of Tompkins County | 2020-06 | General Assistance | $47,210
Fidelity Investments Charitable Gift Fund | 2020-06 | For Grant Recipient's Exempt Purposes | $10,550
Marvin and Annette Lee Foundation | 2019-12 | To Support Charitable Activities | $300
...and 1 more grant received

Personnel at The Learning Web

Name | Title | Compensation | Date of data
Danielia Barron | Executive Director | $15,792 | 2020-12-31
Rick Alvord | Director of Operations | | 2022-08-18
Charlene Santos | Director / Program Leader, Youth Employment Service (YES), Ithaca Youth Bureau | | 2022-08-18
Viki McDonald | Secretary / Board Member | $0 | 2022-08-18
Kristine M. Deluca | President of the Board / President / Board Member | $0 | 2022-08-18
...and 5 more key personnel

Financials for The Learning Web

Revenues | FYE 12/2020 | FYE 12/2019 | % Change
Total grants, contributions, etc. | $931,738 | $853,446 | 9.2%
Program services | $0 | $0 | -
Investment income and dividends | $0 | $0 | -
Tax-exempt bond proceeds | $0 | $0 | -
Royalty revenue | $0 | $0 | -
Net rental income | $0 | $0 | -
Net gain from sale of non-inventory assets | $0 | $0 | -
Net income from fundraising events | $0 | $-500 | -100%
Net income from gaming activities | $0 | $0 | -
Net income from sales of inventory | $0 | $0 | -
Miscellaneous revenues | $201 | $1,949 | -89.7%
Total revenues | $931,939 | $854,895 | 9%

Form 990s for The Learning Web

Fiscal year ending | Date received by IRS | Form | PDF link
2020-12 | 2021-11-09 | 990 | View PDF
2019-12 | 2021-02-17 | 990 | View PDF
2018-12 | 2019-10-31 | 990 | View PDF
2016-12 | 2017-10-23 | 990 | View PDF
2015-12 | 2016-09-12 | 990 | View PDF
...and 6 more Form 990s

Organizations like The Learning Web

Organization | Type | Location | Revenue
Rise Up Reno Prevention Network | 501(c)(3) | Hutchinson, KS | $388,758
Rancho Cielo | 501(c)(3) | Salinas, CA | $3,651,827
Arapahoe Philharmonic | 501(c)(3) | Littleton, CO | $319,439
Bertie County YMCA | 501(c)(3) | Windsor, NC | $285,482
Junior Achievement of Central Illinois | 501(c)(3) | East Peoria, IL | $755,365
Junior Achievement of Maine | 501(c)(3) | Portland, ME | $705,156
Homewood Children's Village | 501(c)(3) | Pittsburgh, PA | $2,399,071
Whiz Kids Tutoring | 501(c)(3) | Denver, CO | $777,654
The Zuni Youth Enrichment Project | 501(c)(3) | Zuni, NM | $1,513,537
Milan Family YMCA | 501(c)(3) | Milan, TN | $388,990

Data update history

August 19, 2022

Updated personnel

Identified 4 new personnel

July 7, 2022

Posted financials

Added Form 990 for fiscal year 2020

July 2, 2022

Updated personnel

Identified 1 new personnel

August 22, 2021

Posted financials

Added Form 990 for fiscal year 2019

July 22, 2021

Updated personnel

Identified 8 new personnel

Nonprofit Types

Civic / social organizations, Youth development programs, Charities

Issues

Human services, Children, Housing, Homelessness, Jobs and employment

Characteristics

State / local level, Receives government funding, Community engagement / volunteering, Tax deductible donations

General information

Address
515 W Seneca St
Ithaca, NY 14850
Metro area
Ithaca, NY
County
Tompkins County, NY
Website URL
learning-web.org/
Phone
(607) 275-0122

IRS details

EIN
16-1494941
Fiscal year end
December
Tax return type
Form 990
Year formed
1996
Eligible to receive tax-deductible contributions (Pub 78)
Yes

Categorization

NTEE code, primary
O00: Youth Development: General
NAICS code, primary
813410: Civic and Social Organizations
Parent/child status
Independent



The Learning Network - The New York Times

Teach and learn with The Times: Resources for bringing the world into your classroom

Highlights

  1. Photo: Finbarr O’Reilly for The New York Times

    What’s Going On in This Picture?

    Look closely at this image, stripped of its caption, and join the moderated conversation about what you and other students see.

     By The Learning Network

  2. Photo: Jordan Strauss/Invision, via Associated Press

    Student Opinion

    Some think the prizes for best male artist and best female artist should simply be “best artist.” Would this approach be more inclusive, or less?

     By Shannon Doyne

    1. Photo: Adali Schell for The New York Times

      Current Events Conversation

      Teenagers weigh in on why young people are reporting record levels of hopelessness.

       By The Learning Network

    2. Photo: Caleb Washington

      Vocabulary

      We invite students to create a short video that defines or teaches any of the words in our Word of the Day collection. Contest dates: Feb. 15-March 15, 2023.

       By The Learning Network

  1. Student Opinion

    Have you ever dreamed of turning a following into a career?

     By Natalie Proulx

  2. Current Events

    Photo: Emile Ducke for The New York Times

    A collection of ideas grounded in Times resources to help students reflect on a year of war, consider its causes and effects, and ponder what’s next.

     By The Learning Network

  3. Student Opinion

    Photo: Tyler Hicks/The New York Times

    It has been one year since Russia invaded Ukraine. How, if at all, has this crisis hit home for you?

     By Katherine Schulten and Natalie Proulx

  4. Student News Quiz

    Photo: MC1 Tyler Thompson/U.S. Navy

    Have you been paying attention to the news in February? See how many of these questions you can get right.

     Compiled by Jeremy Engle

  5. Contests

    Photos, clockwise from top left: Charlie Ballenger, Zubin Carvalho, Amina Bilalova, Osayamen Okungbowa, Jenny Zou and Courtney Duffy

    This year’s lineup mixes classic challenges with new opportunities.

     By The Learning Network


More in Resources for Teaching and Learning ›
  1. Lesson plans and teaching resources based on Times content
  2. Student Opinion Q’s, Picture Prompts & Current Events Conversation
  3. Weekly News Quiz, Word of the Day, Country of the Week and Student Crosswords
  4. Film Club, What’s Going On in This Picture? and What’s Going On in This Graph?
  5. Student Contests & Our Contest Calendar

More in Resources by Subject ›
  1. Lesson plans for English language arts from The Learning Network.
  2. Lesson plans for social studies from The Learning Network.
  3. Lesson plans for science and math from The Learning Network.
  4. Lesson plans for E.L.L. & arts from The Learning Network.
  5. Lesson plans on current events from The Learning Network.


More in Accessible Activities ›
  1. Photo: Alex Fraser/Reuters

    What unusual food combinations do you love?

     By The Learning Network

  2. Photo: Zack Haskell for The New York Times

    How does life go on after a gun violence tragedy?

     By The Learning Network

  3. Photo: Niklas Wesner

    Tell us a story, real or made up, that is inspired by this image.

     By The Learning Network

  4. Photo: Erin Schaff/The New York Times

    Look closely at this image, stripped of its caption, and join the moderated conversation about what you and other students see.

     By The Learning Network

  5. Photo: Agence France-Presse — Getty Images

    Look closely at this image, stripped of its caption, and join the moderated conversation about what you and other students see.

     By The Learning Network

  1. Student Opinion

    How Do You Like to Be Comforted When You Are Sad?

    What can others do or say to make you feel better when you are down?

    By Natalie Proulx

     

  2. Film Club

    Learning With The Times’s ‘Anatomy of a Scene’

    What does it take to make movie magic?

    By The Learning Network

     

  3. Word of the Day

    Word of the Day: pecuniary

    This word has appeared in six articles on NYTimes.com in the past year. Can you use it in a sentence?

    By The Learning Network

     

  4. Word of the Day

    Word of the Day: fusillade

    This word has appeared in 21 articles on NYTimes.com in the past year. Can you use it in a sentence?

    By The Learning Network

     

  5. Picture Prompts

    Wild Creatures

    Tell us a story, real or made up, that is inspired by this GIF.

    By The Learning Network

     

  6. Word of the Day

    Word of the Day: desolation

    This word has appeared in 35 articles on NYTimes.com in the past year. Can you use it in a sentence?

    By The Learning Network

     

  7. What’s Going On in This Graph? | March 8, 2023

    What do you notice and wonder about the location and number of derailed train cars? What do you wonder about derailments?

    By The Learning Network

     

  8. Picture Prompts

    Art and War

    These pieces are made by children of Ukraine. What do you think they are communicating?

    By The Learning Network

     

  9. Word of the Day

    Word of the Day: usurp

    This word has appeared in 33 articles on NYTimes.com in the past year. Can you use it in a sentence?

    By The Learning Network

     

  10. Contests

    Our 10th Annual Student Editorial Contest

    We invite students to write opinion pieces on the issues that matter to them. Contest dates: March 15 to April 12, 2023.

    By The Learning Network

     

Neural networks for beginners. Part 1 / Habr

Hello to all readers of Habrahabr. In this article I want to share my experience of studying neural networks and, as a result, implementing them in Java on the Android platform. My introduction to neural networks came when the Prisma app was released: it processes any photo using neural networks and recreates it from scratch in a selected style. Having become interested, I rushed to look for articles and tutorials, first of all on Habr. To my great surprise, I did not find a single article that clearly laid out, step by step, how neural networks actually work. The information was scattered and key points were missing. Also, most authors rush to show code in one programming language or another without giving detailed explanations.

Therefore, now that I have mastered neural networks reasonably well and have gathered a large amount of information from various foreign resources, I would like to share it in a series of publications that collect everything you need if you are just starting to get acquainted with neural networks. In this article I will not put a strong emphasis on Java and will explain everything with examples, so that you can transfer it to any programming language you need. In subsequent articles I will talk about my application, written for Android, which predicts the movement of stocks or currencies. In other words, everyone who wants to plunge into the world of neural networks and craves a simple and accessible presentation of information, or who simply does not understand something and wants to brush it up, is welcome under the cut.

My first and most important discovery was the playlist by the American programmer Jeff Heaton, in which he clearly and in detail analyzes the principles of neural networks and their classification. After watching this playlist, I decided to create my own neural network, starting with the simplest example. You probably know that when you start learning a new language, your first program is Hello World; it is something of a tradition. The world of machine learning has its own Hello World, and that is a neural network that solves the XOR problem. The exclusive-or truth table looks like this:

a b c
0 0 0
0 1 1
1 0 1
1 1 0

Accordingly, the neural network takes two numbers as input and should give another number at the output - the answer. Now about the neural networks themselves.

What is a neural network?


A neural network is a sequence of neurons connected by synapses. The structure of the neural network came to the world of programming straight from biology. Thanks to this structure, a machine gains the ability to analyze and even memorize various information. Neural networks are also capable not only of analyzing incoming information, but also of reproducing it from their memory. Those interested should definitely watch two TED Talks videos: Video 1 and Video 2. In other words, a neural network is a machine interpretation of the human brain, which contains millions of neurons that transmit information in the form of electrical impulses.

What kinds of neural networks are there?

For now, we will consider examples using the most basic type of neural network - the feedforward network (hereinafter FFN). In subsequent articles I will introduce more concepts and tell you about recurrent neural networks. An FFN, as the name implies, is a network with serially connected layers of neurons, in which information always flows in only one direction.

Why do we need neural networks?

Neural networks are used to solve complex problems that require analytical calculations similar to what the human brain does. The most common applications of neural networks are:

Classification - the distribution of data by parameters. For example, a set of people is given as input and you need to decide which of them to give a loan to and which not. This work can be done by a neural network analyzing information such as age, solvency, credit history, and so on.

Prediction - the ability to predict the next step. For example, the rise or fall of stocks, based on the situation in the stock market.

Recognition is currently the widest application of neural networks. It is used in Google when you search by photo, or in phone cameras when they detect the position of your face and highlight it, and in many other places.

Now, to understand how neural networks work, let's take a look at their components and parameters.

What is a neuron?


A neuron is a computational unit that receives information, performs simple calculations on it, and passes it on. Neurons are divided into three main types: input (blue), hidden (red) and output (green). There are also a bias neuron and a context neuron, which we will talk about in the next article. When a neural network consists of a large number of neurons, the term layer is introduced. Accordingly, there is an input layer that receives information, n hidden layers (usually no more than 3) that process it, and an output layer that outputs the result. Each neuron has 2 main parameters: input data and output data. For an input neuron, input = output. For the rest, the input field contains the summed information from all neurons of the previous layer, after which it is normalized by the activation function (for now, just think of it as f(x)) and placed in the output field.
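
As a rough illustration (a made-up sketch in Java, not code from the article's application), a neuron with these two parameters might look like this:

    // Illustrative sketch only: a neuron with the two parameters
    // described above, "input" and "output".
    class Neuron {
        double input;   // weighted sum of the previous layer's outputs
        double output;  // input passed through the activation function f(x)

        // Input neurons simply pass the value through: input = output.
        void setAsInput(double value) {
            input = value;
            output = value;
        }

        // Hidden and output neurons: sum the previous layer's outputs
        // multiplied by the corresponding synapse weights, then apply f.
        void feed(double[] prevOutputs, double[] weights,
                  java.util.function.DoubleUnaryOperator f) {
            double sum = 0.0;
            for (int i = 0; i < prevOutputs.length; i++) {
                sum += prevOutputs[i] * weights[i];
            }
            input = sum;
            output = f.applyAsDouble(input);
        }
    }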

It is important to remember that neurons operate on numbers in the range [0,1] or [-1,1]. But what, you ask, about numbers that fall outside this range? At this stage, the simplest answer is to divide 1 by the number. This process is called normalization, and it is used very often in neural networks. More on this a little later.
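
Taken literally, that quick trick could be sketched like this (an illustration of the idea only; real projects more often use min-max scaling):

    // Crude normalization as described above: values whose magnitude is
    // already within [0, 1] are left alone, anything larger is replaced
    // by its reciprocal, which maps it back into the range.
    static double normalize(double x) {
        return Math.abs(x) > 1.0 ? 1.0 / x : x;
    }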

What is a synapse?


A synapse is a connection between two neurons. A synapse has one parameter - its weight. Thanks to it, the input information changes as it is transmitted from one neuron to another. Say 3 neurons pass information to the next one; then we have 3 weights corresponding to each of these neurons, and the information coming through the synapse with the larger weight will dominate in the receiving neuron (an example is color mixing). In fact, the set of neural network weights, or weight matrix, is a kind of brain of the whole system. It is thanks to these weights that the input information is processed and turned into a result.

It is important to remember that during initialization of the neural network, the weights are randomized.
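
For example, random initialization of the weight matrix between two layers might be sketched like this (the layer sizes and the [-1, 1] range are illustrative assumptions, not prescribed by the article):

    import java.util.Random;

    // Randomly initialize the weights between a layer of prevSize neurons
    // and a layer of nextSize neurons; w[i][j] is the weight of the synapse
    // from neuron i of the previous layer to neuron j of the next layer.
    static double[][] initWeights(int prevSize, int nextSize, Random rnd) {
        double[][] w = new double[prevSize][nextSize];
        for (int i = 0; i < prevSize; i++) {
            for (int j = 0; j < nextSize; j++) {
                w[i][j] = rnd.nextDouble() * 2.0 - 1.0; // random value in [-1, 1]
            }
        }
        return w;
    }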

How does a neural network work?


In this example, part of a neural network is shown, where the letters I denote the input neurons, the letter H denotes a hidden neuron, and the letter w denotes the weights. The formula shows that the input information is the sum of all the input data multiplied by their corresponding weights. As an example, let us give the inputs 1 and 0, and let w1 = 0.4 and w2 = 0.7. The input of neuron H1 will then be: 1*0.4 + 0*0.7 = 0.4. Now that we have the input, we can get the output by plugging the input into the activation function (more on that later). Now that we have the output, we pass it on. And so we repeat for all layers until we reach the output neuron. Running such a network for the first time, we will see that the answer is far from correct, because the network is not trained. To improve the results, we will train it. But before we learn how to do this, let's introduce a few terms and properties of a neural network.
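
The arithmetic of this walk-through can be checked with a few lines of code (sigmoid is assumed here as the activation function f):

    double i1 = 1.0, i2 = 0.0;      // the example inputs
    double w1 = 0.4, w2 = 0.7;      // the example weights

    double h1Input  = i1 * w1 + i2 * w2;                  // 1*0.4 + 0*0.7 = 0.4
    double h1Output = 1.0 / (1.0 + Math.exp(-h1Input));   // sigmoid(0.4) ≈ 0.598

    System.out.println("H1 input = " + h1Input + ", H1 output = " + h1Output);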

Activation function

An activation function is a way to normalize the input (we've talked about this before). That is, if you have a large number at the input, passing it through the activation function, you will get an output in the range you need. There are a lot of activation functions, so we will consider the most basic ones: Linear, Sigmoid (Logistic) and Hyperbolic tangent. Their main difference is the range of values.

Linear

This function is almost never used, except when you need to test a neural network or pass a value without transformations.

Sigmoid

This is the most common activation function, and its range of values is [0,1]. Most of the examples on the web use it, and it is also sometimes called the logistic function. Accordingly, if in your case there are negative values (for example, stocks can go not only up but also down), then you need a function that also captures negative values.

Hyperbolic tangent

It only makes sense to use the hyperbolic tangent when your values can be both negative and positive, since the range of the function is [-1,1]. It is not advisable to use this function with only positive values, as this will significantly worsen the results of your neural network.
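
For reference, the three activation functions mentioned above can be written as follows (a simple sketch; the linear variant here just passes the value through unchanged):

    // Linear: passes the value through without transformation.
    static double linear(double x) {
        return x;
    }

    // Sigmoid (logistic) function: output range (0, 1).
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // Hyperbolic tangent: output range (-1, 1).
    static double tanh(double x) {
        return Math.tanh(x);
    }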

Training set

A training set is a sequence of data that the neural network operates on. In our case of exclusive or (XOR), we have only 4 different outcomes, which means we will have 4 training sets: 0 xor 0 = 0, 0 xor 1 = 1, 1 xor 0 = 1, 1 xor 1 = 0.
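
In code, the whole XOR training set fits into two small arrays (one row per training set, in the order listed above):

    // The four XOR training sets: the inputs and the expected (ideal) answers.
    double[][] trainInputs = { {0, 0}, {0, 1}, {1, 0}, {1, 1} };
    double[]   trainIdeal  = {   0,      1,      1,      0    };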

Iteration

This is a kind of counter that increases every time the neural network goes through one training set. In other words, this is the total number of training sets passed by the neural network.

Epoch

When the neural network is initialized, this value is set to 0, and its ceiling is set manually. The more epochs, the better the network is trained and, accordingly, the better its result. The epoch increases every time we go through the entire collection of training sets, in our case 4 sets, or 4 iterations.

It is important not to confuse iteration with epoch and to understand the order in which they are incremented: first the iteration increases n times, and only then the epoch, not vice versa. In other words, you cannot first train the neural network on only one set, then on another, and so on. You need to train on each set once per epoch. This way you avoid errors in the calculations.
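
The relationship between the two counters can be sketched as a nested loop: one iteration per training set, and the epoch grows only after all the sets have been processed (maxEpochs is an arbitrary ceiling chosen for this illustration):

    int maxEpochs = 10000;  // the manually set ceiling (an arbitrary value here)
    int iteration = 0;

    for (int epoch = 1; epoch <= maxEpochs; epoch++) {
        for (int set = 0; set < trainInputs.length; set++) {
            // one forward pass (and, later, one training step) on this set
            iteration++;
        }
        // only now, after all 4 sets (4 iterations), does the epoch increase
    }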

Error

Error is a percentage value that reflects the discrepancy between the expected and the received answers. The error is computed every epoch and should decline. If it does not, you are doing something wrong. The error can be calculated in different ways, but we will consider only three main methods: Mean Squared Error (hereinafter MSE), Root MSE, and Arctan. There is no usage restriction as there is for the activation function, so you are free to choose whichever method gives you the best results. One only has to take into account that each method counts errors differently. With Arctan the error will almost always be larger, since it works on the principle that the larger the difference, the larger the error. Root MSE will have the smallest error, which is why MSE, which keeps a balance in the error calculation, is used most often.

MSE = ((i1 - a1)^2 + (i2 - a2)^2 + ... + (in - an)^2) / n

Root MSE = sqrt(MSE)

Arctan = (arctan^2(i1 - a1) + arctan^2(i2 - a2) + ... + arctan^2(in - an)) / n

Here i1...in are the ideal answers, a1...an are the answers received from the network, and n is the number of training sets.

The principle of error calculation is the same in all cases. For each set, we calculate the error by subtracting the network's answer from the ideal answer. Then we either square this difference or take the square of its arctangent, after which we divide the resulting number by the number of sets.
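
Written out in code for one epoch, the three error measures look like this (ideal[] holds the expected answers, actual[] the answers the network produced; a sketch following the formulas above):

    // Mean Squared Error over n training sets.
    static double mse(double[] ideal, double[] actual) {
        double sum = 0.0;
        for (int k = 0; k < ideal.length; k++) {
            double d = ideal[k] - actual[k];
            sum += d * d;
        }
        return sum / ideal.length;
    }

    // Root MSE: the square root of the MSE.
    static double rootMse(double[] ideal, double[] actual) {
        return Math.sqrt(mse(ideal, actual));
    }

    // Arctan error: the squared arctangent of each difference, averaged.
    static double arctanError(double[] ideal, double[] actual) {
        double sum = 0.0;
        for (int k = 0; k < ideal.length; k++) {
            double d = Math.atan(ideal[k] - actual[k]);
            sum += d * d;
        }
        return sum / ideal.length;
    }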

Task

Now, to test yourself, calculate the result given by the neural network using the sigmoid function, and its error using MSE.

The answer is 0.33, and the error is ((1 - 0.33)^2)/1 ≈ 0.45, i.e. 45%.

Thank you very much for your attention! I hope that this article was able to help you in the study of neural networks. In the next article, I will talk about bias neurons and how to train a neural network using backpropagation and gradient descent.


NOU INTUIT | Lecture | Methods of classification and forecasting. Neural networks


Abstract: The lecture describes the neural network method. The elements and architecture of neural networks, the learning process, and the phenomenon of neural network overfitting are considered. The perceptron model of a neural network is described. An example of solving a problem using neural networks is given.


The idea of ​​neural networks was born within the theory of artificial intelligence, as a result of attempts to imitate the ability of biological nervous systems to learn and correct errors.

Neural networks are models of the biological neural networks of the brain, in which neurons are imitated by relatively simple, often identical, elements (artificial neurons).

A neural network can be represented by a directed graph with weighted connections, in which artificial neurons are vertices, and synaptic connections are arcs.

Neural networks are widely used to solve various problems.

Among the areas of application of neural networks are the automation of pattern recognition processes, forecasting, adaptive control, the creation of expert systems, the organization of associative memory, the processing of analog and digital signals, the synthesis and identification of electronic circuits and systems.

Using neural networks, you can, for example, predict product sales and stock market performance, recognize signals, and design self-learning systems.

Neural network models can be implemented in software or in hardware. Here we will consider the first type.

In simple terms, a layered neural network is a collection of neurons that make up layers. In each layer, the neurons are not connected to each other in any way, but are connected to the neurons of the previous and next layers. Information comes from the first to the second layer, from the second to the third, and so on.
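
As a sketch (not code from the lecture), "information comes from the first layer to the second, from the second to the third, and so on" can be expressed as a loop over per-layer weight matrices; a sigmoid activation is assumed here:

    // Forward pass through a layered network. weights[l][i][j] is the weight
    // of the connection from neuron i of layer l to neuron j of layer l + 1.
    static double[] forward(double[] input, double[][][] weights) {
        double[] current = input;
        for (double[][] layer : weights) {
            double[] next = new double[layer[0].length];
            for (int j = 0; j < next.length; j++) {
                double sum = 0.0;
                for (int i = 0; i < current.length; i++) {
                    sum += current[i] * layer[i][j];
                }
                next[j] = 1.0 / (1.0 + Math.exp(-sum)); // sigmoid activation
            }
            current = next;
        }
        return current;
    }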

Among the Data Mining tasks solved with the help of neural networks, we will consider the following:

  • Classification (supervised learning). Examples of classification tasks: text recognition, speech recognition, personal identification.
  • Forecasting. For a neural network, the prediction problem can be formulated as follows: find the best approximation of a function given a finite set of input values (training examples). For example, neural networks make it possible to solve the problem of recovering missing values.
  • Clustering (unsupervised learning). An example of a clustering task can be the task of compressing information by reducing the data dimension. Clustering problems are solved, for example, by self-organizing Kohonen maps. These networks will be the subject of a separate lecture.

Let's consider three examples of tasks for which it is possible to use neural networks.

Medical diagnostics. In the course of monitoring various indicators of patients' condition, a database has been accumulated.

